I am not against the concept of “Big Data”, but I am against the concept of collecting data just because you can. The data needs to have a use. I take this position because you also need to verify and confirm your data, and you have to monitor and maintain the instruments and systems that collect it. Without that last point you can run into some unusual issues.
I ran into an interesting case where a lack of verification and maintenance caused data to be in conflict with itself. I was helping a plant solve a problem with the draft on a rotary kiln. Two of the engineers were arguing over some data that showed a pressure spike in the duct that seemed related to the upsets. One was reporting that the spike occurred right after the upset, the other that it occurred right before.
Both had graphs showing their information. The argument was going strong when someone suggested bringing the information up on the conference room screen. First one engineer brought up his graph, then the other brought up his, and the two traces tracked each other, offset by approximately five minutes. It was then noticed that they were both looking at the same instrument reading.
Shall we say the consternation was a bit high, with several cries of “that’s impossible”. The I&E engineer and techs said they would look at it immediately and report back the next day. It turns out that two separate PLCs were reading and recording the same data point. The internal clocks of both were off, one two minutes fast, the other three minutes slow. The spike was actually occurring right at the upset.
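Just to put numbers on that (the timestamps below are invented for illustration, not taken from the plant historian): a clock running two minutes fast and a clock running three minutes slow will place the same event about five minutes apart in the two records. A rough Python sketch of the effect:

    from datetime import datetime, timedelta

    # Invented timestamp for when the pressure spike actually happened
    true_event = datetime(2016, 5, 10, 14, 0, 0)

    plc_a_clock_error = timedelta(minutes=2)    # this PLC's clock runs two minutes fast
    plc_b_clock_error = timedelta(minutes=-3)   # this PLC's clock runs three minutes slow

    logged_by_a = true_event + plc_a_clock_error   # recorded as 14:02
    logged_by_b = true_event + plc_b_clock_error   # recorded as 13:57

    # The one spike shows up twice in the combined history, five minutes apart
    print(logged_by_a - logged_by_b)   # 0:05:00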
Oh, and it turns out the real problem was a bad bearing mount on the exhaust fan, found by a mechanic who happened to be walking by just as it squealed and the fan stuttered.
A recent story in the Wall Street Journal (1) highlighted this: a piece of large mining equipment (a Joy Continuous Miner) was having problems with one of its drive motors. The operation thought that a control system was at fault, and repairing it would have been a major effort. The equipment vendor (Joy Global) had remote access to the data and was able to indicate that the problem was a heat exchange unit, which was much easier to repair.
As I have said before, I am not against the concept of “Big Data”, but I am against the concept of collecting data just because you can. Once the data has been collected it needs to be collated, analyzed, and summarized, or as we said in the pre-electronic media age: Folded, Spindled and Mutilated (2).
With the advent of improved instrumentation and faster, more powerful data systems, collecting the data is easier than ever. And it looks to become easier still in the future.
Not many operations have the luxury of a full-time staff that can look at all the data and tell what it means. This becomes even more the case with large and special equipment. The operation knows how to run and maintain it (hopefully), but may not be knowledgeable about what some of the instrument readings mean.
Making use of outside sources to help monitor and analyze the data is a way for operations to extend their capability with experts they would not otherwise have. The ability to remotely monitor and analyze equipment data may also become a key consideration when purchasing new equipment in the future.
In mining, keeping the plant operating on target is important; this means keeping the plant on grade and meeting recovery targets. Optimization is beating the grade and recovery targets, or increasing throughput without losing grade and recovery. Best of all is increasing throughput while also increasing grade and recovery. But to do this you have to know what you are doing and how everything is working.
Many a plant operator has been running along, keeping an eye on his instrumentation, when the financial people tell him his actual production is off and ask where the product he said he made has gone. At which point he will start digging through his data.
In most modern plants, the one thing you probably have a lot of is data. But is it any good? As I mentioned in some earlier articles, your data is only as good as how it is collected (https://www.linkedin.com/pulse/data-only-good-how-collected-mike-albrecht-p-e-) and your data is only as good as how it is analyzed (https://www.linkedin.com/pulse/data-only-good-how-analyzed-mike-albrecht-p-e-). But beyond collecting the right data and having the proper tools to analyze it, is the data actually any good? And what does good data mean?
Having good data is a matter of the quality of the data. The key terms are accuracy and precision. As in the picture at the top of this article, your data can be very accurate and precise (the desired state) or some combination thereof. While high accuracy and high precision is the goal, you will often be at the other extreme.
Understanding what affects the accuracy and precision of your data can help you understand the data itself. Having the correct tools to collect and analyze your data is very important, as is understanding its accuracy and precision.
In general use, the two words precision and accuracy are often treated as the same; in technical use they are different. In science, engineering, industry, and statistics, accuracy is the degree of closeness of a set of data to the quantity's true value (or accepted true value).
Precision is related to reproducibility and repeatability, or how closely repeated measurements under unchanged conditions show the same results.
Accuracy and precision are often defined in terms of systematic and
random errors. The more common definition associates accuracy with
systematic errors and precision with random errors.
The system used to collect your data can be accurate but not precise, precise but not accurate, neither, or both. For example, if your measurement system contains a systematic error (often called a bias), it can give a set of data that is very repeatable and reproduces consistently but is actually off by a significant amount (low accuracy, high precision). Once this is found, it can be corrected.
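As a rough sketch of that situation (the numbers below are made up, not from any plant): a biased but precise instrument repeats tightly around the wrong value, and once the bias is estimated against a reference value it can be subtracted out.

    import statistics

    reference_value = 100.0                          # accepted true value, e.g. from a lab check
    readings = [104.8, 105.1, 104.9, 105.0, 105.2]   # tight spread: high precision

    bias = statistics.mean(readings) - reference_value   # systematic error, about +5.0
    spread = statistics.stdev(readings)                  # random error, about 0.16

    corrected = [r - bias for r in readings]             # remove the bias once it is known
    print(f"bias = {bias:.2f}, spread = {spread:.2f}")
    print(corrected)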
For accuracy we can distinguish:
· the difference between the mean of the measurements and the reference value, the bias (establishing and correcting for bias is necessary for calibration); and
· the combined effect of that and precision.
For precision:
· repeatability: the variation arising when all efforts are made to keep conditions constant by using the same instrument and operator, and repeating during a short time period; and
· reproducibility: the variation arising when the same measurement process is used among different instruments and operators, and over longer time periods.
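To make the distinction concrete, here is a small sketch with made-up numbers: repeatability is the spread within one instrument and operator over a short run, while reproducibility is the spread across different instruments and operators measuring the same standard sample.

    import statistics

    # One instrument, one operator, back-to-back readings of a standard sample
    same_instrument = [50.1, 50.2, 50.1, 50.0, 50.2]

    # Two other instruments/operators measuring the same standard sample
    other_instrument_1 = [49.4, 49.5, 49.3]
    other_instrument_2 = [50.8, 50.9, 50.7]

    repeatability = statistics.stdev(same_instrument)   # small, about 0.08

    # Pooling all the readings shows a noticeably larger spread, because
    # instrument-to-instrument differences dominate
    all_readings = same_instrument + other_instrument_1 + other_instrument_2
    reproducibility = statistics.stdev(all_readings)

    print(f"repeatability ~ {repeatability:.2f}")
    print(f"reproducibility ~ {reproducibility:.2f}")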
So, what does this mean? First, are your instruments actually reading what you think they are, and on top of that, are they the right instruments for what you want? Next, instruments are influenced by weather and age, and need to be recalibrated on a regular basis. They also do not last forever. Keeping on top of your plant requires knowing how well your instruments are doing, so keep an eye on them.
o 40+ years’ experience in the mining industry, with strong mineral processing experience in precious metals, copper, industrial minerals, coal, and phosphate.
o Operational experience in precious metals, coal, and phosphate, plus in petrochemicals.
o Extensive experience in studies and feasibility work in the US and internationally (United States, Canada, Mexico, Ecuador, Colombia, Venezuela, Chile, China, India, Indonesia, and Greece).