ISSN: 2637-4676
Ziya Altas1, Mehmet Metin Ozguven1* and Yusuf Yanar2
Received: November 10, 2018; Published: November 27, 2018
Corresponding author: Mehmet Metin Özgüven, Department of Biosystems Engineering, Turkey
DOI: 10.32474/CIACR.2018.05.000214
This study was conducted to determine the level of Cercospora leaf spot disease in a local sugar beet field in Tokat province using image processing algorithms, and to check the agreement between the disease assessment obtained with these algorithms and the visual assessment made by an expert using a disease severity scale. For this purpose, 12 images showing different development levels of the disease, taken at different times and under different natural lighting conditions in the field, were analyzed by image processing using the Image Processing Toolbox module of the MATLAB program. The disease severity results obtained (a: 100%, b: 48%, c: 42%, d: 21%, e: 80%, f: 28%, g: 74%, h: 47%, i: 29%, j: 46%, k: 20%, m: 51%) were compared with the observation results (a: 100%, b: 50%, c: 45%, d: 20%, e: 80%, f: 30%, g: 75%, h: 50%, i: 30%, j: 50%, k: 20%, m: 50%). The closeness of these values indicates that the study was carried out successfully. In addition, it was determined that the image processing results give precise and accurate values of the diseased area that cannot be determined by observation alone.
Keywords: Drones; Image Processing; Sugar Beet; Leaf Spot Disease; Disease Detection
The technical processes that have emerged in line with technological advances contribute to the economical, sustainable, and productive industry that plant and animal production aims for. Image processing techniques have become an important tool in facilitating agricultural operations and in bringing alternative solutions to problems that need to be solved or improved. Thanks to the algorithms and software developed, researchers have carried out numerous studies on disease, pest and weed detection; plant identification and detection; determination of plant stresses; yield estimation; obstacle detection; determination of distances between and within rows; classification of soil and land cover; estimation of botanical composition; evaluation of vegetation indexes and green area index; determination of plant growth variability; follow-up of product and root development; modeling of irrigation management practices; and determination of soil moisture in plant production, as well as, in animal production, monitoring of animal development in a herd, movement skill scoring, measurement of body characteristics, determination of body condition score, body weight monitoring, lameness detection, determination of pain locations, body temperature monitoring, and location determination. Examples of these studies are shown in Table 1. By applying the experience gained in implementing image processing techniques in agriculture together with machine learning, deep learning, artificial intelligence, modeling, and simulation applications, real-time and automated expert systems, autonomous tractors and agricultural machines, and agricultural robotics applications have been developed (Table 2). For this reason, image processing techniques will continue to be one of the most important agricultural research topics now and in the future.
Increasing the quality and productivity of crops in agricultural activities depends on monitoring the growing plants closely and carrying out the necessary operations at the right time. Drone systems, which have a simple technical structure and are easy to use, offer farmers an opportunity to plan agricultural activities using their embedded sensors and cameras, providing high-quality and 3D images [1]. Image processing techniques are used to extract information from a moving or fixed image captured by a camera or scanner, in digital format, using a number of algorithms. MATLAB and C++ are widely used analysis programs today; color and shape analysis can easily be performed in real time on digitized objects with these programs [2]. Sugar beet is the most important plant species used in sugar production in the world after sugar cane, and it is a two-year (biennial) summer crop. In the first year, it forms its root body under the soil, which provides the sugar yield. In the second year, the plant develops its above-ground organs to form seeds [3]. Sugar beet production contributes to the development of plant and animal production, improves the physical structure of the soil, improves ecological balance, and maximizes the yield of the crops planted after it. For this reason, early detection of diseases and pests in sugar beet farming and the prevention of yield loss through the required agricultural pest control are very important [4]. Sugar beet leaf spot disease (Cercospora beticola Sacc.) is one of the most significant, common, and harmful fungal diseases affecting sugar beet. Individual leaf spots are almost circular and are 3-5 mm in diameter at maturity. The lesion changes from light brown to dark brown with reddish-purple borders, depending on the anthocyanin production of the leaf. As the disease progresses, individual spots merge, and the severely infected tissue first turns yellow and then becomes brown and necrotic. Healthy leaves stay green and are less affected or do not contain lesions [5].
Table 2: Examples of studies conducted using image processing together with different techniques.
Cercospora beticola Sacc. damages the plant by harming the leaves (Figure 1). The spots, initially few in number and small and round, increase rapidly and cover the whole leaf surface. As a result, the leaf dries completely and dies. The disease, starting from the outermost leaves of the beet, develops from the outside to the inside, drying all the leaves, while the beet constantly produces new leaves from the crown to maintain its vital activities. As a result, the beet constantly consumes its energy to form new leaves, leading to insufficient polar (sugar content) and root growth [6]. Cercospora leaf spot is one of the most important diseases of sugar beet both in Turkey and in the world [7-9]. Depending on the severity of the disease in the field, it can cause a loss in sugar yield of 10% to 50% [8,10]. In Turkey, the disease reduces tuber yield in sugar beet by 6-35% and sugar yield by 1-26% [11]. In addition, the disease increases the levels of potassium (6%), sodium (25%) and alpha-amino nitrogen (40%), which negatively affect sugar production from sugar beet [9]. The disease is controlled through integrated pest management programs that include the use of tolerant varieties, crop rotation, and fungicide applications [12]. For an effective integrated pest management program to be implemented, it is vital to identify the outbreak of the disease, its severity, and its progress in the field in a timely and correct manner. In addition, rapid and accurate identification of disease severity greatly helps to reduce crop loss [13].
Traditionally, disease severity in plants is determined visually by experienced specialists using different scales according to the type of disease. In determining the severity of Cercospora leaf spot disease, the 1-9 scale [14] or the 1-5 visual scale of Vereijssen et al. [15] or Schmittgen [16] is used. The percentage disease severity is then calculated from the values obtained with these scales using the Townsend-Heuberger formula [17]. These methods are costly, labor-intensive, and time-consuming, and they can lead to mistakes and errors, depending on the level of experience of the specialist, in crops produced over large areas such as sugar beet and wheat [18]. To eliminate these problems, there is a need for faster and more practical methods that reduce human error in the identification of plant diseases, disease severity, and disease progress, especially in large production areas. This study aimed to determine the severity of leaf spot disease using image processing techniques with various developed algorithms, using a drone, under field conditions in sugar beet produced over large areas. The study was carried out in a sugar beet field near the 74th Branch of the Highways Authority on the Tokat-Amasya highway, under natural infection conditions, on an area of approximately 200 m2 with various levels of disease severity.
The drone used in the study is a DJI Phantom 3 Advanced, which features a 12-megapixel camera with video recording of up to 1080p at 60 fps. The camera has a 94-degree viewing angle and an f/2.8 lens. The drone can send real-time images in 720p HD format to a smartphone or tablet within a range of approximately 2 km. It carries a GPS positioning system and is able to determine its position by scanning the ground with the aid of ultrasonic sensors. With the autopilot feature, it can start its motors and fly at a preset altitude. When GPS is active, it can return to its take-off position with the return button. When the battery is low or the connection with the controller is lost for any reason, the failsafe feature returns the drone to the take-off position for a safe landing [19].
MATLAB version R2014a was used for image processing. MATLAB is a numerical computing environment and fourth-generation programming language. It allows processing of matrices, plotting of functions and data, implementation of algorithms, creation of user interfaces, and interfacing with programs written in other languages, including C, C++, Java, and Fortran. The power that MATLAB brings to digital image processing is its broad set of functions for processing multidimensional arrays of image data. Image processing was performed using the Image Processing Toolbox module of MATLAB. The Image Processing Toolbox is a collection of functions that extends the capabilities of the MATLAB numerical computing environment. These functions, together with the expressive power of the MATLAB language, make it possible to write image processing operations compactly and clearly, providing an ideal software prototyping environment for solving image processing problems.
Leaf spot disease in sugar beet is determined by local observations made by plant protection experts. Local disease assessments were made by the expert according to the 0-9 scale given in Table 3 [14] and the visual scale given in Figure 2. Images were taken from a height of 30-60 cm using the camera system installed on the drone. Disease severity was then calculated from the scale values obtained by the terrestrial observations according to the Townsend-Heuberger formula [17,20]: Disease severity (%) = [Σ(n × V) / (Z × N)] × 100, where n is the number of plants in each disease severity class of the scale, V is the scale value, Z is the highest scale value, and N is the total number of plants observed. The images taken with the drone were processed with the developed image processing algorithms, and disease rates at leaf and plant level were calculated [21-29]. Finally, the agreement between the two methods was determined by comparing the disease severity values obtained by image processing with the terrestrial values, in line with the goals of the study [30-41].
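To illustrate how the Townsend-Heuberger formula is applied, a minimal MATLAB sketch is given below; the plant counts per scale class are hypothetical values, not data from the study.

    % Townsend-Heuberger percent disease severity from 0-9 scale observations.
    scaleValues = 0:9;                        % V: scale values (0-9 scale, Table 3)
    plantCounts = [2 5 8 10 7 4 3 1 0 0];     % n: plants per scale class (hypothetical)
    Z = max(scaleValues);                     % Z: highest scale value
    N = sum(plantCounts);                     % N: total number of plants observed
    severity = sum(plantCounts .* scaleValues) / (Z * N) * 100;
    fprintf('Disease severity: %.1f %%\n', severity);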
Sugar beet is a summer-season culture plant grown over two years. It is planted at the end of March and harvested in early October. In the climatic conditions of Tokat, the first infections of the disease appear in May-June, and the disease continues to damage plants throughout the entire production season, being most intense in August [42-48]. For this reason, images were taken between August and September 2016, when the disease was most intense. Images were obtained using the drone under natural lighting conditions. The first image was taken on August 10, 2016, and images were taken every 15 days until the end of September. Of the images taken at 4000x2250 pixel resolution under natural lighting conditions on August 7, August 25, and September 7, 12 images showing different development levels of the disease in the sampling area were processed [49-52]. Since all the leaves were dead, images taken on September 22 were not subjected to image processing. Preliminary studies showed that sunlight and shadows adversely affect plant images taken under natural lighting conditions in the field when the drone's flying height exceeds 60 cm. For this reason, images were taken at distances of about 30-60 cm, since plant heights and plant-drone distances vary.
Leaf image disease segmentation, which is the most important step in diagnosing the disease, classifies pixels into K classes according to a set of features using typical algorithms such as K-means clustering [53,54]. In this study, the K-means clustering algorithm was applied to the diseased leaf images to identify the disease. The algorithm steps are given below, followed by an illustrative sketch.
a) Step 1: Data entry; sugar beet leaf images taken from the field are entered into the program. The leaf images are RGB images in JPEG format as shown in Figure 3.
b) Step 2: Each image is converted from the RGB color space to the L*a*b* color space, since this color space limits the image distortions caused by brightness when working with color. In the L*a*b* color space, the color information used to identify the disease is stored in only two channels (the a* and b* components).
c) Step 3: Classification; the colors in the a*b* space are classified using K-means clustering. The pixels of the diseased image (the colors carried by the a* and b* values) are clustered with K-means using the Euclidean distance.
d) Step 4: Pixel tagging; Each pixel in the image is tagged using the results from the K-means clustering. For each pixel in the input, the K-means returns an index that corresponds to a cluster. As shown in Figure 4, each pixel in the image was tagged with cluster index.
e) Step 5: Separating the diseased leaf image according to color; using the pixel tags, the pixels in the image are separated by color, resulting in three images (i.e., K = 3), as shown in Figure 5.
Figure 5: K-means cluster segmentation of diseased leaf image: (A) black segment; (B) green segment; (C) brown segment (disease image).
f) Step 6: The diseased image is selected among three clusters.
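The steps above can be sketched in MATLAB roughly as follows, assuming the Image Processing and Statistics Toolboxes are available; the file name 'leaf.jpg' is a placeholder, and the exact script used in the study is not given in the text.

    % Step 1: read an RGB leaf image (placeholder file name).
    rgb = imread('leaf.jpg');
    % Step 2: convert RGB to L*a*b*; color information is carried by a* and b*.
    cform = makecform('srgb2lab');
    lab = applycform(im2double(rgb), cform);
    ab = reshape(lab(:,:,2:3), [], 2);        % one [a* b*] row per pixel
    % Step 3: cluster the a*b* values into K = 3 classes with K-means.
    K = 3;
    idx = kmeans(ab, K, 'Distance', 'sqEuclidean', 'Replicates', 3);
    % Step 4: tag every pixel with its cluster index.
    pixelLabels = reshape(idx, size(rgb,1), size(rgb,2));
    % Steps 5-6: split the image into one masked image per cluster
    % (e.g., background, green tissue, brown diseased tissue) and
    % select the cluster corresponding to the brown lesions.
    segmented = cell(1, K);
    for k = 1:K
        mask = repmat(pixelLabels == k, [1 1 3]);
        img = rgb;
        img(~mask) = 0;
        segmented{k} = img;
    end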
Contrast enhancement of color images is typically carried out by transforming the intensity component of the image [55-60]; L*a*b* is such a color space. Color conversion functions were used to convert the image from RGB to the L*a*b* color space, and the processing was then applied to the luminosity (L*) layer of the image. The luminosity layer is replaced by the processed data, and the image is converted back to the RGB color space (Figure 6). Adjusting the luminosity changes the intensity of the pixels while preserving the original colors.
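A possible MATLAB sketch of this step is shown below; the use of adapthisteq for enhancing the L* layer is an assumption, since the text does not name the enhancement function applied, and the input file name is again a placeholder.

    % Enhance only the luminosity (L*) layer, then convert back to RGB
    % so the original colors are preserved.
    rgb = imread('leaf.jpg');                 % placeholder input image
    rgb2labForm = makecform('srgb2lab');
    lab2rgbForm = makecform('lab2srgb');
    lab = applycform(im2double(rgb), rgb2labForm);
    maxL = 100;                               % L* ranges from 0 to 100
    L = lab(:,:,1) / maxL;                    % scale to [0,1] for adapthisteq
    lab(:,:,1) = adapthisteq(L) * maxL;       % contrast-enhance luminosity only (assumed method)
    enhanced = applycform(lab, lab2rgbForm);  % back to RGB (cf. Figure 6)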
The diseased area on the leaf is calculated by dividing the number of pixels constituting the diseased regions by the total number of pixels forming the leaf [61-65]; this proportion indicates the severity of the disease. Let B(x,y) denote the value at row x and column y of a segmented binary image having m rows and n columns. The area of the kth object can then be found as:
Ak = Σx Σy B(x,y)
where
Ak = area of the object in the image (diseased area)
B(x,y) = value at the xth row and yth column of the recognized image
Disease severity (%) = (Ak / Total Area) × 100 (3)
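Continuing the K-means sketch above, the severity ratio can be computed from the pixel labels; the cluster indices below are placeholders, since which cluster corresponds to the brown lesions depends on the image.

    % Hypothetical cluster indices: brown lesion cluster and the clusters
    % that together make up the whole leaf (green + brown tissue).
    lesionCluster = 3;                                      % assumed brown segment
    leafClusters = [2 3];                                   % assumed green + brown segments
    diseasedPixels = nnz(pixelLabels == lesionCluster);     % Ak
    leafPixels = nnz(ismember(pixelLabels, leafClusters));  % total leaf area
    severityPct = diseasedPixels / leafPixels * 100;        % equation (3)
    fprintf('Disease severity from pixels: %.1f %%\n', severityPct);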
Within the scope of the method described [66-69], 12 images showing different levels of development of the disease in the sampling area were processed using the developed algorithms, and the image processing results obtained are given below. The original images in JPEG format are shown in Figures 7-14, labeled (1), and the pixel tagging results of the colors classified by K-means clustering are labeled (2). After clustering, the images were separated into color components: (3) shows the green component of the plant [70-73] and (4) shows the brown parts, indicating the disease. Label (5) shows the contrast-enhanced images, processed with the developed algorithms to make the green and diseased parts of the plant more apparent. The spatial image processing results are shown from (h) to (m), where (1) shows the original image, (2) the pixel labeling image after clustering, (3) the brown, diseased parts among the color-separated images [74,75], and (4) the contrast-enhanced images of the brown parts.
In the study, the Image Processing Toolbox module of the MATLAB program was used to determine the presence of sugar beet leaf spot disease (Cercospora beticola Sacc.) and its severity in 12 images showing different development levels of the disease in the sampling area. The results obtained are presented in Table 4. When Table 4 is examined, it is seen that the results obtained using the drone system and image processing techniques and the results obtained by observation are very close to each other. These close results indicate the success of the study. In addition, it was determined that the evaluation results obtained by observation are approximate, integer values, whereas the results obtained using image processing techniques give the precise value of the diseased area with a sensitivity that cannot be achieved by observation.
In this study, the fact that the image processing techniques applied to the images taken by the camera attached to the drone identified the diseased areas precisely suggests that the method can be used for disease identification as an alternative to the observation method, and that it may even be preferable since it gives exact values. This study shows that, thanks to the success of the method and the ease of observation with drones in the field, which take images without disturbing the field, labor and time losses can be prevented and diseases on the plants can be effectively controlled throughout the whole season. Thus, yield losses due to disease in sugar beet, which contributes greatly to the national economy, can be prevented through the necessary pest management carried out in a timely manner.
It is believed that the study, with its unique design conducted in the field under natural lighting conditions instead of a closed laboratory environment, will contribute significantly to agricultural activities. The successful identification of leaf spot disease in sugar beet suggests that the method can be used in applications such as plant growth monitoring, yield determination, and disease and pest detection in different plant species. Since it was shown that sunlight and shadows adversely affect plant images taken under natural lighting conditions in the field when the drone's flying height exceeds 60 cm, images in the study were taken at distances of about 30-60 cm, since plant heights and plant-drone distances vary. These heights determined in the study are expected to be helpful for researchers conducting similar studies in the future.