
The use of infrared thermography as an early indicator of bovine respiratory disease complex in calves

Abstract

Bovine respiratory disease (BRD) complex causes considerable distress to domestic livestock and economic hardship to the beef industry. Furthermore, the resulting extensive use of antimicrobial treatments is a growing concern from the perspective of facilitating antibiotic resistant microbes. Earlier detection of BRD would enable an earlier, more targeted treatment regime and earlier isolation of infected individuals. The objective of the present study was to investigate the use of non-invasive infrared thermography for the early detection of BRD in cattle. Studies were conducted on 133 head of weaned calves. The data demonstrated that infrared thermography was able to identify animals at early stages of illness, often several days to over one week before clinical signs were manifest. At 4-6 days prior to the onset of clinical symptoms of BRD, infrared thermography yielded greater positive and negative predictive values and test efficiency (80%, 65% and 71%, respectively) than the industry standard practice of clinical scoring (70%, 45% and 55%, respectively).

Infectious diseases such as bovine respiratory disease (BRD) are known to have a significant economic impact on the cattle industry with respect to treatment costs and a negative impact on animal performance and welfare (Basarab et al., 1997; Wittum et al., 1996). Although the term BRD can refer to a host of complex diseases, it is generally used to refer to an animal displaying an undifferentiated fever together with some number of clinical signs. As discussed by Jericho and Kozub (2004), BRD is common in intensively raised calves and the industry's dependence on antibiotic treatment is high. Such management practices are becoming increasingly scrutinized in many countries due to concern regarding antibiotic use and the potential promotion of antibiotic resistant microbes (Fleming, 1998; Jericho and Kozub, 2004). The issue is also receiving growing attention in the popular press (Zimmerman and Zimmerman, 1996; Mellon et al., 2001). The ability to treat BRD, especially in multiple-sourced and co-mingled animals, is becoming more difficult. Shahriar et al. (2002), for example, report the presence of antibiotic resistant pneumonia in feedlot cattle. As a result, a more targeted and selective use of antimicrobials in the animal industries is sought. As discussed by Cusack et al. (2003), the effectiveness of treating BRD depends primarily on early recognition and treatment. Unfortunately, traditional clinical signs of BRD often occur late in the course of the disease. By contrast, Humblet et al. (2004) demonstrated the utility of acute phase proteins, particularly haptoglobin and fibrinogen, as early indicators of bronchopneumonia in calves. However, measurement of acute phase proteins requires the collection of biological samples and complicated laboratory-based analysis that cannot be performed in situ. Other early indicators of disease onset have also been observed, such as the appearance of antibodies, cytokines and acute phase proteins in the blood, and fever.
In a review of the behaviour of sick animals, Hart (1988) describes and references the observation that the acute phase fever response is initiated by endogenous pyrogens such as interleukin-1 following infection by many known microorganisms. The value or function of such a fever is thought to lie in enhancing the animal's ability to combat infection; the mechanisms involved include an increase in neutrophils, enhanced lymphocyte proliferation and enhanced antibody synthesis. Recent research (Schaefer et al., 2004) has demonstrated the diagnostic potential of infrared thermography (IRT) under controlled conditions for an induction model of bovine viral diarrhea (BVD). Considering that up to 60% of the heat loss from an animal can occur in the infrared range (Kleiber, 1975), the observation that radiated heat loss can serve as an early indicator of fever conditions is perhaps logical. In terms of treating animals displaying symptoms of BRD, the industry standard practice has been to use one or more antibiotics. Indeed, some researchers consider it difficult to even place and raise calves in feedlots without using antibiotics (Jericho and Kozub, 2004). Again, this is a practice that is increasingly considered undesirable; the European Community's Scientific Steering Committee, for example, has recommended against the abundant use of antibiotics in the livestock industries (Jericho and Kozub, 2004).

The primary objective of the present study was to investigate the use of infrared thermography in the early identification of spontaneously occurring BRD. Perceived advantages of the infrared thermography technology were that diagnosis could be made in real time and conducted non-invasively. However, a comparative test of the IRT technology under spontaneous conditions was considered prudent.

Research studies are reported on 133 multiple-sourced, commingled commercial weaned calves. These studies were conducted at the Lacombe Research Centre beef research facility and all management practices followed Canadian Council on Animal Care guidelines (Canadian Council on Animal Care, 1993) and Canadian Beef Cattle Code of Practice guidelines (Agriculture Canada, 1991). In addition, the research protocols were reviewed and approved by the Lacombe Research Centre animal care committee. The cattle demographics for the initial procurement of 171 animals involved ninety-three (93) … These calves were chosen in order to provide study groups displaying a bovine respiratory disease (BRD) incidence range of 30-60%, which is typical of the beef industry in Canada for these types of cattle. Data used in the present study for true positive and true negative animals were corrected for any health aberrations due to non-respiratory disease (e.g., pink eye, foot rot) and for any data that could not be verified. For example, a calf received at the Lacombe beef research unit displaying keratoconjunctivitis (pink eye), pododermatitis (foot rot) or an infected wound of any other kind would be examined by the Research Centre veterinarian and placed on a treatment program. While it is recognized that such animals are also of interest and importance from a herd health perspective, these animals were not ill due to BRD nor, for the purpose of a gold standard, was the time of onset of their condition known. Thus, such animals would risk biasing the BRD population data and were not considered part of the BRD data set.
This verification process left 133 calves in the study from an initial 171. The weaned calves under study at the Lacombe Research Centre were typically received in the afternoon, having undergone approximately 8 h of transport, handling and co-mingling. These calves were offloaded into clean holding pens, given access to water and allowed to rest for a few hours prior to being processed. Processing consisted of measurements for haematology, clinical scores, salivary cortisol, infrared thermal measurements and body weights. Blood samples for haematology and serology assessment were collected by venous puncture of the jugular vein. For this purpose animals were briefly restrained in a conventional hydraulic cattle handling catch. Repeated measures were made on all animals at approximately weekly intervals. Animals were maintained on cereal grain silage rations which met or exceeded NRC (1984) recommendations. Any animals deemed ill, either by direct veterinary assessment or under veterinary guidance, were provided with routine antimicrobial therapy and/or other treatment and support until recovered. All calves were vaccinated for common BRD agents including clostridia, bovine viral diarrhoea and infectious bovine rhinotracheitis at the end of the research study.

The cattle in the present study were used to develop a "gold standard" for defining a true positive population for the presence of BRD. The approach of subjectively "judging" who might be ill and who might be well on the basis of pen checking has been found wanting in the cattle industry. Hence, the use of a gold standard has been deemed useful in both human (American College, 1992) and animal clinical studies (Galen and Gambino, 1975; Thrusfield, 1995). Several research groups have developed gold standards for the presence of BRD, and there tend to be common aspects to all such standards. In the current study we based our gold standard on ranges for normal and abnormal parameters, following the approach of Humblet et al. (2004). The criterion for a true positive diagnosis was defined as any animal displaying two or more of the following signs: a core temperature of 40 °C or higher; a white blood cell count of less than 7 × 10³ µL⁻¹ or greater than 11 × 10³ µL⁻¹; a clinical score of 3 or higher; and a neutrophil/lymphocyte ratio of less than 0.1 or greater than 0.8. A true negative animal was defined as any animal displaying one or fewer of the above signs. These parameters are consistent with ranges for normal or abnormal values suggested by several clinical texts (Blood et al., 1983; Kaneko, 1980) as well as historical data from our own lab. Using the above criteria, true positive and true negative populations of animals were established. A second objective of the study was to develop a predictive index for the infrared thermal data by establishing cut-off values for thermal parameters useful in the early prediction of respiratory disease. This would be possible only once true positive and true negative populations of animals had been established. In the present study, the methods described by Galen and Gambino (1975) and Humblet et al. (2004) were utilized, whereby the positive predictive value (PPV) was defined as the proportion of all positive test results that were true positive.
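As a concrete illustration of this decision rule, the following minimal sketch (in Python; the function and argument names are our own illustration, not code from the study) classifies an animal against the four gold-standard criteria:

```python
# Sketch of the "gold standard" decision rule described above.
# Function and argument names are illustrative, not from the study.

def count_brd_signs(core_temp_c, wbc_thousands_per_ul, clinical_score, nl_ratio):
    """Count how many of the four gold-standard signs an animal displays.

    core_temp_c          : rectal core temperature (deg C)
    wbc_thousands_per_ul : white blood cell count (10^3 cells/uL)
    clinical_score       : additive clinical score (see scoring system below)
    nl_ratio             : neutrophil/lymphocyte ratio
    """
    signs = 0
    signs += core_temp_c >= 40.0                                    # fever
    signs += wbc_thousands_per_ul < 7 or wbc_thousands_per_ul > 11  # abnormal WBC
    signs += clinical_score >= 3                                    # clinical illness
    signs += nl_ratio < 0.1 or nl_ratio > 0.8                       # abnormal N/L ratio
    return signs

def is_true_positive(core_temp_c, wbc, score, nl_ratio):
    """True positive = two or more signs; true negative = one or fewer."""
    return count_brd_signs(core_temp_c, wbc, score, nl_ratio) >= 2
```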
The negative predictive value (NPV) was the proportion of all negative test results that were true negative, and test efficiency was defined as the proportion of all test results that were true results, whether positive or negative. Data for true positive and true negative animals were used only for those animals in which the day of illness could be evaluated. For example, some commercial calves arrived at the research facility already displaying signs of illness; for those animals an exact day of symptom onset could not be established for the purpose of monitoring days prior to illness. Infrared (IRT) scan data were orbital maximum temperatures collected with a FLIR S60 camera (FLIR, Boston, MA) from a distance of approximately 1 m. Digitized infrared images were collected and saved in greyscale, with each greyscale pixel corresponding to a specific temperature. Pseudo-colours (Fig. 2) were added or generated by computer enhancement. An orbital image was obtained by auto-tracing a circle over the orbital area, including the eyeball and approximately 1 cm of surrounding skin of the eye socket. This area, which includes the lachrymal gland, is in the authors' observation very sensitive to thermoregulatory changes associated with stress and disease conditions.

Clinical scores were based on those used in other studies and reported previously (Schaefer et al., 2004). The clinical score was the sum of the following four components:

Respiratory insult (0-5): 0 = no insult, normal breath sounds (NBS); 1 = very fine crackle (rale) (VFCR) on auscultation and/or a moderate cough; 2 = fine crackle (subcrepitant) (FCR) on auscultation and/or a moderate nasal discharge and moderate cough; 3 = medium crackle (crepitant) (MCR) on auscultation and/or a moderate to severe viscous nasal discharge with cough; 4 = coarse crackles (CCR), tachypnea (>15% of the norm) and/or a severe discharge with respiratory distress and obtunded lung sounds; 5 = CCR with dyspnea, tachypnea, marked respiratory distress and/or lung consolidation.

Digestive insult (0-5): 0 = no insult, normal, eating and drinking; 1 = mild or slight diarrhoea with slight dehydration (<5%) and reduced eating; 2 = moderate diarrhoea with 10% dehydration and reduced feed intake (<50%); 3 = moderate to severe diarrhoea with 10% or less of normal feed intake and more than 10% dehydration; 4 = severe diarrhoea and less than 10% of normal feed intake; 5 = severe diarrhoea, not eating, not drinking and dehydrated.

Temperature score, core (rectal) temperature (0-5): 0 = <37.7 °C; 1 = 37.7-38.2 °C; 2 = 38.3-38.8 °C; 3 = 38.9-39.4 °C; 4 = 39.5-40.0 °C; 5 = >40 °C.

Disposition or lethargy score (0-5): 0 = no lethargy, normal posture; 1 = mild anorexia or listlessness, depressed appearance; 2 = moderate lethargy and depression, slow to rise, anorectic; 3 = recumbent or abnormal posture, largely depressed; 4 = prostrate, recumbent or abnormal posture; 5 = death.

Salivary cortisol was measured by collecting saliva onto a cotton swab and determining cortisol levels in the saliva by an enzyme immunoassay modification of an existing method reported by Cook et al. (1997). Haematology values were measured on a Cell-Dyn 700 Hematology Analyser (Sequoia-Turner Corp., Mountain View, CA). Differential blood cell counts were determined via stained blood smears (Giemsa-Wright quick stain) and direct microscope examination of 100 cells.
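The three indices defined above reduce to simple ratios over the counts of true and false test results. A minimal sketch of the calculation, following the Galen and Gambino (1975) definitions quoted in the text; the example counts are hypothetical, chosen only to reproduce the pre-clinical IRT values quoted in the abstract:

```python
def predictive_indices(tp, fp, tn, fn):
    """PPV, NPV and test efficiency per Galen and Gambino (1975).

    tp, fp, tn, fn: counts of true/false positive/negative test results.
    """
    ppv = tp / (tp + fp)                           # positive results that are true
    npv = tn / (tn + fn)                           # negative results that are true
    efficiency = (tp + tn) / (tp + fp + tn + fn)   # all results that are true
    return ppv, npv, efficiency

# Hypothetical counts, not the study's data: these happen to give
# PPV = 0.80, NPV = 0.65 and efficiency ~ 0.71, the pre-clinical
# IRT values quoted in the abstract.
print(predictive_indices(tp=12, fp=3, tn=13, fn=7))
```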
Animals displaying overt clinical symptoms of BRD in the present study received immediate treatment as recommended by the Lacombe Research Centre veterinarian, followed by continued monitoring and re-treatment if required. To investigate the qualitative and quantitative presence of typical BRD viruses in this class of cattle, a representative subgroup of 15 animals underwent serology assessment for five common viral causes of BRD. The selection criteria for these animals were as follows: the calves had to originate from a minimum of two farm herds or sources; the calves had to have been weaned within 48 h of auction sale; the calves had to have been recently transported on a commercial carrier (in the present case the transport time was approximately 8 h); the calves had to have been exposed to handling and managed through a commercial auction facility; and finally, the calves had to have been exposed to a time off feed and water since weaning of between 24 and 48 h. The viruses assessed were infectious bovine rhinotracheitis (IBR), bovine virus diarrhea (BVD), Corona virus, PI3 (bovine para-influenza) and BRSV (bovine respiratory syncytial virus). Assessment of viral presence and titres was done by either ELISA methods (PI3, Corona and BRSV) or serum neutralization methods (IBR, BVD) and was performed by Prairie Diagnostic Services (Saskatoon, Saskatchewan). The titre values for these viruses are shown in Table 5. For samples analysed by ELISA methods (BRSV, PI3 and Corona), antigen and tissue control well plates were used. Antibody concentrations are expressed by measuring the optical density as a percentage of a positive control serum, whereby: units = (mean net optical density of test serum − mean net optical density of fetal bovine serum) / (mean net optical density of positive standard − mean net optical density of fetal bovine serum) × 100.

Calculations for positive predictive value (PPV), negative predictive value (NPV) and test efficiency were conducted as per the methods of Humblet et al. (2004). Receiver operating characteristic (ROC) curves (Fig. 1) were based on the methods described by Zweig and Campbell (1993). MedCalc software (MedCalc, 2006) was used in the calculation of the ROC curves (Fig. 1) and in developing the predictive values. ROC curves were used to calculate optimal efficiency cut-off values as shown in Tables 3 and 4. For infrared temperatures, the IRT absolute temperature was the direct temperature measured from the FLIR camera and calculated using the FLIR "Researcher" software. The IRT mean ratio values represent the mean orbital maximum temperature value for an individual animal divided by the mean orbital maximum temperature value for the contemporary group of calves as they arrived at the research centre. In addition to the above, a statistical comparison of test performance for Tables 3 and 4 was conducted with Fisher's exact test using MedCalc software (Galen and Gambino, 1975).

This study illustrates how one aspect of the thermal information, the orbital maximum temperature, and the use of mean ratios have utility as early predictors of illness. Tables 1 and 2 illustrate how these predictors compare to other clinical standards commonly used in the animal industries, both at the time a positive diagnosis was made (the day of illness) and for a subset of animals measured 4-6 days prior to the positive identification of the disease.
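The internals of the MedCalc calculation are not shown in the paper, but an "optimal efficiency" cut-off can in principle be derived from any ROC curve. The sketch below is our own illustration, using scikit-learn and simulated orbital temperatures rather than the study's data or software; it picks the threshold that maximizes test efficiency:

```python
import numpy as np
from sklearn.metrics import roc_curve

# Simulated orbital maximum temperatures (deg C) for hypothetical
# true-negative and true-positive animals; not the study's data.
rng = np.random.default_rng(0)
temps = np.concatenate([rng.normal(35.5, 0.4, 60),   # true negatives
                        rng.normal(36.3, 0.4, 40)])  # true positives
truth = np.concatenate([np.zeros(60), np.ones(40)])

fpr, tpr, thresholds = roc_curve(truth, temps)
n_pos = truth.sum()
n_neg = truth.size - n_pos
# Test efficiency at each candidate cut-off: (TP + TN) / N.
efficiency = (tpr * n_pos + (1.0 - fpr) * n_neg) / truth.size
best = efficiency.argmax()
print(f"cut-off = {thresholds[best]:.2f} deg C, "
      f"efficiency = {efficiency[best]:.2f}")
```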
Of the sixty-five animals meeting the criteria for a true positive diagnosis, it was possible to also collect prior data, from 4-6 days earlier, on thirty-seven of these calves. Bovine respiratory disease predictive values, cut-off values as determined by ROC curves, and efficiency calculations were completed for all calves in the true positive and true negative populations. These calculations were made for both clinical (Table 3) and 4-6 days pre-clinical (Table 4) time frames. Both absolute and mean ratio infrared values are displayed and compared to several other commonly used predictors of disease in cattle, including core temperature, white blood cell count, neutrophil/lymphocyte ratio, clinical scores and salivary cortisol values. Examples of receiver operating characteristic (ROC) curves are shown for both clinical scores and infrared absolute values calculated at both clinical and pre-clinical times. For illustrative purposes, the progression and onset of bovine respiratory disease as detected by infrared thermography is shown in one animal (Fig. 2).

In terms of test comparisons, Fisher's exact test results indicated that on the day of illness there were no significant differences in the ability of the tests to distinguish diseased animals (P > 0.05). In other words, when an animal was clinically ill, then clinical scores, core temperature, haematology values or infrared thermography would identify that animal as ill with equivalent accuracy. However, for data 4-6 days prior to the appearance of clinical illness, Fisher's exact test results indicated that infrared thermography was significantly better (P < 0.01) at identifying true positive animals than either clinical scores or core temperature.

As illustrated by the serology data in Table 5, collected from a representative group of contemporary multiple-sourced, co-mingled auction cattle, the calves used in the present study would appear to display significant titres to many of the viruses thought to be commonly responsible for BRD. While the presence of these viruses in serology samples would be expected as a normal profile, especially in any calves that had been vaccinated, the high titres for many of the animals are suggestive of significant viral infectivity and were likely responsible for the onset of BRD in the calves.

The identification of true positive and true negative populations of calves was attempted in the present study. The authors appreciate that there are often debates regarding the optimal composition of parameters to define a gold standard, and predictably some debate would likely also arise regarding the parameters and values used in the present study. However, as seen in Table 1, the gold standard set of values used in the present study was nonetheless efficacious as a decision tool for BRD onset and in establishing true positive and true negative subsets in the calf population. This is arguably more accurate than simply utilizing a single subjective score or opinion for illness, as is common industry practice in such situations. The development of the gold standards for true positive and true negative criteria enabled an objective comparison of several common tests of disease presence. While none of the tests is perfect, as evident in Table 1, when an animal is actually verifiably ill a number of tests, including clinical scores, core temperatures and haematology, will perform comparably well at identifying ill animals.
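As an illustration of this comparison, the sketch below (our own; the 2 × 2 counts are hypothetical, not the study's data) applies Fisher's exact test to the numbers of true positive animals identified versus missed by two competing tests:

```python
from scipy.stats import fisher_exact

# Hypothetical 2x2 table comparing two tests on the same 37 true-positive
# calves: rows are tests, columns are (identified, missed) counts.
# These counts are illustrative only, not the study's data.
table = [[30, 7],    # infrared thermography: identified vs missed
         [17, 20]]   # clinical score: identified vs missed
odds_ratio, p_value = fisher_exact(table)
print(f"P = {p_value:.4f}")  # P < 0.05 indicates the tests differ significantly
```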
Likewise, the predictive values of many tests (Table 3) perform comparably well when the animals are actually ill. The challenge, however, as discussed earlier, is how to develop and test predictive methods that are diagnostically relevant prior to the onset of the clinical signs of disease. In this respect, candidate methods such as infrared thermography are appealing both because they are non-invasive and because, in contrast to laboratory-based analysis, IRT technology can operate in real time. Apparent in the present study was the observation that infrared values, especially the mean ratio values, were as efficient as or more efficient than clinical scores, core temperatures or haematology in identifying ill animals prior to the clinical appearance of BRD; of particular significance, the infrared data showed this predictive capability before the onset of clinical BRD. Of interest was the observation that another comparatively non-invasive technique, salivary cortisol, likewise demonstrated strong positive and negative predictive values and test efficiency compared to conventional measures such as core temperature and clinical scores. Table 4 illustrates how, with the use of a receiver operating characteristic curve calculation, such data could be used to optimise the prediction of positive and negative values for animals at risk of BRD, and how different measurements perform.

[Table 4 presents data for 4-6 days prior to illness in a subset of calves subsequently determined as true positive using the gold standard values; Table 3 presents data for the day of illness, i.e. the day clinical assessment was true positive using the "gold standard" values. Footnotes to both tables: the clinical score system is described in the methods section; IRT = infrared thermography value for the orbital (eye) maximum temperature in °C; the mean ratio value is the mean infrared maximum temperature for the individual animal divided by that of the contemporary group of calves; probabilities were determined by two-tailed t-test in Microsoft Excel; cut-off values were calculated with MedCalc (2006) software.]

In addition to the above, the authors would point out that the incidence of BRD in the animal populations used in the current study was comparatively high. A valid question therefore remains regarding how the infrared thermography technology might perform as an early indicator of disease in animal populations displaying a lower incidence of BRD. One must also keep in mind that, as discussed by Galen and Gambino (1975), parallel and/or series testing can be used to increase all measures of diagnostic test performance. In the case of infrared values, for example, using both infrared absolute values and infrared mean ratio values in a parallel test would improve the negative predictive values and test efficiency by 5-10%. There are many additional ways infrared data can be utilized, and only two such possibilities (absolute and mean ratio data) are presented in the present manuscript. Optimising the use of such values will be the next step in this research.
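In the Galen and Gambino (1975) sense, a parallel test declares an animal positive if either component test is positive, trading some specificity for sensitivity (and, as noted above, improved NPV and test efficiency). A minimal sketch, with hypothetical cut-off values rather than the study's:

```python
# Parallel testing in the sense of Galen and Gambino (1975): the combined
# test is positive if EITHER component test is positive.  The cut-off
# values below are hypothetical illustrations, not the study's.

ABS_CUTOFF = 36.0     # absolute orbital maximum temperature (deg C)
RATIO_CUTOFF = 1.01   # individual mean orbital max / group mean

def parallel_brd_test(irt_abs, irt_ratio):
    """Flag an animal as at risk if either IRT value exceeds its cut-off."""
    return irt_abs >= ABS_CUTOFF or irt_ratio >= RATIO_CUTOFF

print(parallel_brd_test(35.8, 1.02))  # True: the mean ratio flags the animal
```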
Furthermore, the use of infrared thermography lends itself to non-invasive automation of data collection. In concurrent research (Stewart et al., 2005), individual animal identification using radio frequency identification (RFID) tags and an IRT camera located at a water station has been employed to automatically collect thermal data. This is impossible to achieve with more invasive measures such as haematology or core temperatures. Of interest, speculatively, is the question of how best to utilize the data from an early disease detection system, perhaps one employing an infrared parameter, were such a system available. Animals identified early might be re-vaccinated, or simply isolated from the contemporary group to avoid viral shedding and infection of other individuals. In this regard, recent trials at the Lacombe Research Centre have demonstrated that animals identified early as at risk of developing BRD benefited from the prophylactic application of respired nitric oxide (Schaefer et al., 2006). As discussed by McMullin et al. (2005) and Ghaffari et al. (2006), nitric oxide is a known microbicidal agent in vitro and thus may well function effectively in vivo as well.

Data collected in the present study demonstrated that infrared thermography scans of the orbital area in calves were efficacious as an early identifier of the onset of bovine respiratory disease. Such information would enable the earlier and more targeted treatment of affected animals, thereby reducing animal suffering, improving animal industry economics and reducing the likelihood of promoting antibiotic resistant microbes.

[Fig. 2. Retrospective facial infrared thermography images of the same animal during the onset of bovine respiratory disease. Clinical signs were positive (>3) on days 9-10.]
