Identification of the Flowering and Fruit Ripening Periods of Wolfberry Based on Faster R-CNN*
ZHU Yong-ning1,3,4, ZHOU Wang2**, YANG Yang1,3,4, LI Jian-ping1,3,4, LI Wan-chun1,3,4, JIN Hong-wei2, FANG Feng2
(1.Key Laboratory for Meteorological Disaster Monitoring and Early Warning and Risk Management of Characteristic Agriculture in Arid Regions, CMA, Yinchuan 750002, China; 2.Aerospace Newsky Technology Co., Ltd, Wuxi 214000, China; 3.Key Laboratory of Meteorological Disaster Prevention and Reduction of Ningxia, Yinchuan 750002, China; 4.Ningxia Institute of Meteorological Science, Yinchuan 750002, China)
Using images captured in 2018 and 2019 by 16 real-scene monitoring systems installed in wolfberry fields in Ningxia, and drawing on the botanical characteristics of the wolfberry flowering and fruit ripening periods, the faster region-based convolutional neural network (Faster R-CNN) method was used to train on and classify the images and to construct an identification algorithm for the two periods, with average precision (AP) and mean average precision (mAP) as the model evaluation indicators. The automatic identification results were compared with expert visual judgments and with field observation records. The results show that when the important hyperparameters batch size and number of iterations were set to 64 and 20000 respectively, the mAP reached 0.74, and the identification of flowers and fruits on the test set was better than under other parameter settings. The flowering and fruit ripening periods identified by Faster R-CNN differed from expert visual judgments by 2-5d; the two methods share the same judgment object and criteria and are therefore highly comparable, so expert visual judgments can serve as a verification standard for the automatic identification technique and be used to optimize and adjust the algorithm. The automatic identification results differed from field observation records of the same period by 0-12d, mainly because the two methods differ in both judgment object and criteria, which makes it difficult to optimize the automatic identification algorithm with field observations.
Wolfberry; Flowering period recognition; Fruit ripening period recognition; Developmental period recognition; Faster R-CNN; Image recognition
Crop observation is an important component of agrometeorological observation and mainly covers developmental periods, growth status, yield components, and diseases and pests. Developmental periods serve as the dividing lines for many agrometeorological indicators[1] and provide basic information for meteorological services to agriculture. The current specification for agrometeorological observation requires parallel observation of the physical elements of the crop environment (meteorological elements, field soil moisture, etc.) and of the crop elements (developmental period, growth status, yield, etc.)[2]. Observation of meteorological elements was fully automated in April 2020, and soil moisture is now monitored by automatic soil moisture stations, but observation of crop elements still relies on manual work and simple instruments measured in the field[3]. Agrometeorological observation has long been labor-intensive and insufficiently timely, and it requires experienced professionals, making it hard to popularize[4]; the current observation methods can no longer meet the needs of operational service development.
In recent years, with the development of the Internet of Things, of computer hardware (especially graphics processing units, GPUs), and of deep learning, image-based crop classification, crop developmental period identification, and pest and disease recognition have all achieved considerable research results. Li et al. trained a convolutional neural network (CNN) to recognize maize tassels and thereby identify the maize tasseling period, achieving a recognition rate, precision, and recall of 99.42%, 99.53%, and 99.37%, respectively[5]. Lu et al. used the RGB and HSL color spaces to extract the proportions of green and yellow pixels in whole images to determine the developmental periods of summer maize, reaching an accuracy of 94.24%, with 100% accuracy for the sowing, emergence, three-leaf, and seven-leaf periods[6]. Liu et al. used Faster R-CNN to accurately locate grape leaves in images and proposed a CNN-based disease detection algorithm whose mean average precision over six common grape diseases reached 66.47%, exceeding 70% for leaf spot and powdery mildew[7]. Liu studied maize developmental period identification with computer vision, with errors within 2d of manual observations[8]. Xiong et al. studied the detection of nutrient-deficiency symptoms in soybean leaves during the growth period with a Mask R-CNN model, whose classification accuracy on the test set was 89.42%[9]. Zhang et al. combined spatial pyramid pooling with an improved YOLOv3 deep convolutional neural network to propose a crop pest identification algorithm that reached a mean identification precision of 88.07% over 20 pest classes in real scenes[10]. These studies show that image recognition has good application prospects and feasibility for identifying crop developmental periods, pests, and diseases[11-12]; its advantages are that it is time-saving and efficient and that it overcomes the subjectivity of manual observation.
Convolutional neural networks (CNNs) have the properties of local connectivity, weight sharing, and pooling, which give them a certain degree of translation, scaling, and rotation invariance, and they perform outstandingly in image and video analysis tasks[13]. In recent years algorithms including R-CNN, SPP-Net, Fast R-CNN, Faster R-CNN, YOLO, and SSD have been developed. YOLO and SSD are one-stage detectors with higher efficiency and stronger applicability for real-time object detection, whereas R-CNN, Fast R-CNN, and Faster R-CNN are region-proposal-based convolutional neural networks with better detection performance that have achieved better results on public benchmarks. The Faster R-CNN model combines a region proposal network (RPN) with Fast R-CNN, replacing the selective search algorithm with the RPN; this removes the bottleneck of the large time overhead of region proposal and further improves both detection speed and accuracy[14]. It has been widely used in crop feature recognition[15-19], weed recognition[20-22], and fields such as remote sensing and medicine[23-24].
In 2018, with the support of the Ningxia agrometeorological service system and rural meteorological disaster prevention system project, 16 real-scene monitoring systems were installed in wolfberry fields. To make full use of the image data, study automatic identification algorithms for wolfberry developmental periods, and gradually realize automatic observation of developmental periods, this study explored identification techniques for the wolfberry flowering and fruit ripening periods. Because developmental period observation does not require real-time detection, only high-precision detection of target features in images, the Faster R-CNN technique was selected.
The wolfberry images came from the real-scene monitoring systems mounted on 16 wolfberry field microclimate stations in Ningxia in 2018-2019. Each system uses a DH-SD-6A9630U high-definition camera with 5 megapixels, capable of 360° horizontal rotation, 0-90° vertical rotation, and zoom shooting. Each system took 10 images per day during the wolfberry growing season (April 1 to November 15) from a height of 6m, at a resolution of 2560×1920, saved as 24-bit RGB true-color JPG files. More than 3000 images containing flowering and fruit-ripening features were captured; after removing images with lens fouling or unsatisfactory viewing angles, 1210 image samples remained. To avoid underfitting or overfitting caused by too few or too many images of a given class in training, the data were augmented by rotation, cropping, and flipping so that the numbers of flowering period and fruit ripening period samples were balanced, yielding 7260 experimental samples: 5808 in the training set and 1452 in the test set. The augmented samples were organized in the PASCAL VOC2007 dataset format. Based on the distinctive image features of the flowering and fruit ripening periods, the labelImg tool was used to mark the flowers and fruits in all image samples as label objects, producing 12100 "flower" labels and 11602 "fruit" labels (Fig.1).
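The sample bookkeeping above can be sketched as follows. The per-image augmentation factor (five variants plus the original, giving 1210×6 = 7260) and the 80/20 train/test split are inferred from the stated counts rather than given explicitly in the paper, so both are assumptions.

```python
# Sketch of the dataset bookkeeping; the augmentation factor (5 variants per
# original) and the 80/20 split are inferred from the stated totals.
import random

def augment_counts(n_originals: int, n_variants_per_image: int) -> int:
    """Total samples when each image keeps its original plus n variants."""
    return n_originals * (1 + n_variants_per_image)

def train_test_split(samples, test_fraction=0.2, seed=0):
    """Shuffle sample IDs and split them into training and test sets."""
    rng = random.Random(seed)
    ids = list(samples)
    rng.shuffle(ids)
    n_test = int(len(ids) * test_fraction)
    return ids[n_test:], ids[:n_test]   # train, test

total = augment_counts(1210, 5)          # 1210 originals -> 7260 samples
train, test = train_test_split(range(total))
print(total, len(train), len(test))      # 7260 5808 1452
```

The split reproduces the 5808/1452 partition reported in the text.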
Camera control. Two areas were selected within the effective shooting range of each camera, with five consecutively planted wolfberry trees in each area. The shooting angles were preset so that the crown of each tree was photographed at the same angle every time; images were taken once per day, 10 images each time.
Data on the flowering and fruit ripening periods were obtained by three methods.
(1) Field observation. Following the specification for agrometeorological observation of wolfberry[25], field observations were carried out during the 2019 growing season at the Yinchuan Wolfberry Research Institute observation station (Y0200) and the Zhongning Shilaba Village observation station (Y0211), in parallel with the camera shooting. The same 10 wolfberry trees photographed by the cameras were selected for field observation, and two branches on each observed plant were chosen as observation branches. When the features of a developmental period appeared on an observation branch, that branch was considered to have entered the period; the date on which the wolfberry population in the plot entered a developmental period was determined by the percentage of observed branches that had entered it, with ≥50% defining entry into the universal period. Six wolfberry developmental periods involve flowering or fruit ripening features: flowering on the first fruit-bearing (old) shoots (flowers open on the old shoots), fruit ripening on the first fruit-bearing shoots (green fruits on the old shoots swell rapidly and turn bright, glossy red), flowering on the summer fruit-bearing shoots, fruit ripening on the summer fruit-bearing shoots (green fruits swell rapidly and turn bright, glossy red), flowering on the autumn shoots, and autumn fruit ripening (green fruits on the autumn shoots turn red).
Fig.1  Feature labels of wolfberry "flower" and "fruit"
(2) Expert visual judgment. All images from stations Y0200 and Y0211 were judged visually by five experienced experts. The criterion was as follows: if five features of a given developmental period appeared in one image, the wolfberry tree in that image was considered to have reached the universal period; if five of the 10 images taken on a given day reached the universal period, the wolfberry population in the plot was considered to have entered that developmental period. The final expert visual judgment was obtained by synthesizing the experts' opinions.
(3) Automatic identification. Faster R-CNN was used to train on the training set images and construct the automatic identification algorithm for the wolfberry flowering and fruit ripening periods, which then marked the flowering and fruit-ripening features in the images. The simulation platform was a GPU server with an Intel Broadwell E5-2650 v4 processor (2.2GHz), 128GB of memory, a 4TB hard disk, and an NVIDIA Titan XP GPU, running Ubuntu 16.04.9 with Python 2.7, the MKL 2017 math kernel library, CUDA 8.0, and the cuDNN 8.0 deep neural network library, with Caffe as the deep learning framework.
According to observation experience, a wolfberry tree crown is generally pruned into two layers with an average of 10 branches per layer; because the camera shoots from above, it can capture the 10 branches of the top layer. Following the principle that a plot enters the universal period when 50% of the observed branches show the features of a developmental period, it was stipulated that a tree is considered to have entered the universal period when five feature points appear in one image of it, and that the observed field has entered the universal period when five of the 10 images reach it.
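The two-level rule above (five feature points per image, five qualifying images per day) can be stated as a minimal sketch; the feature counts in the usage example are hypothetical.

```python
# Minimal sketch of the "universal period" decision rule described above:
# a tree (one image) qualifies when >=5 feature points are detected, and the
# plot qualifies when >=5 of the day's 10 images qualify.
def tree_reaches_period(n_features: int, threshold: int = 5) -> bool:
    return n_features >= threshold

def plot_reaches_period(features_per_image, image_threshold: int = 5) -> bool:
    """features_per_image: detected feature counts for the day's 10 images."""
    qualifying = sum(tree_reaches_period(n) for n in features_per_image)
    return qualifying >= image_threshold

day = [6, 5, 2, 7, 5, 0, 3, 5, 1, 4]     # hypothetical per-image counts
print(plot_reaches_period(day))           # True: 5 images have >=5 features
```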
Among the wolfberry developmental periods, three involve flowering features and three involve fruit ripening features. The identification algorithm built with Faster R-CNN can only recognize the features themselves; it cannot tell whether a flower (or fruit) belongs to the first fruit-bearing shoots, the summer shoots, or the autumn shoots. Based on the developmental characteristics of wolfberry, a time-series judgment was therefore introduced when distinguishing flowering or fruit ripening at different stages. Judgments are made day by day from the first available image. The first identified universal flowering period is taken as the universal flowering period of the first fruit-bearing shoots, and the subsequent date on which the 10 images contain no flowering features is taken as the first node; the next identified universal flowering period is taken as that of the summer fruit-bearing shoots, and the following date with no flowering features in the 10 images is taken as the second node; the universal flowering period identified after that is taken as that of the autumn shoots. The fruit ripening dates are judged in the same way.
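The time-series judgment above can be sketched as a scan over daily results; the dates in the usage example are hypothetical.

```python
# Hedged sketch of the time-series judgment: scanning daily results in order,
# the first universal flowering date starts the first (old-shoot) episode; a
# day on which all 10 images show zero flowering features closes the episode,
# and the next universal date starts the next one (summer, then autumn).
def assign_episodes(daily):
    """daily: list of (date, universal_flag, all_zero_flag) in time order.
    Returns the first universal date of each episode."""
    episodes, in_episode = [], False
    for date, universal, all_zero in daily:
        if not in_episode and universal:
            episodes.append(date)        # episode start = universal date
            in_episode = True
        elif in_episode and all_zero:
            in_episode = False           # node: features dropped to zero
    return episodes

days = [("05-10", False, False), ("05-12", True, False),
        ("06-20", False, True),  ("06-25", True, False),
        ("08-30", False, True),  ("09-05", True, False)]
print(assign_episodes(days))  # ['05-12', '06-25', '09-05']
```

The three returned dates would be read as the universal flowering periods of the old, summer, and autumn shoots respectively.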
2.1.1 Overall workflow of Faster R-CNN
Faster R-CNN mainly consists of an RPN and the Fast R-CNN object detector[26]. A VGG16 network extracts the feature map of the input image, the RPN generates region proposals, and Fast R-CNN detects and recognizes the targets in the candidate regions proposed by the RPN. The overall workflow of Faster R-CNN has four stages (Fig.2).
Fig.2  Object detection architecture of Faster R-CNN
(1) Feature extraction: Faster R-CNN first uses the VGG16 network to extract the feature map of the input image; this feature map is shared by the subsequent RPN layer and the fully connected layers.
(2) RPN: the RPN generates candidate region boxes. It classifies each anchor as foreground or background and then refines the anchor boxes by bounding-box regression to obtain accurate proposals.
(3) ROI pooling: this layer collects the input feature map and the candidate target regions, extracts the feature map of each target region, and feeds it to the subsequent fully connected layers for class determination.
(4) Classification and regression: the class of each target region is computed from its feature map, and bounding-box regression is applied again to obtain the final precise position of the detection box.
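As an illustration of step (2), RPN anchors can be generated per feature-map cell as below. The scales and aspect ratios are the common Faster R-CNN defaults and the stride of 16 follows the VGG16 feature map; none of these values are stated in the paper, so they are assumptions.

```python
# Illustrative sketch of RPN anchor generation (scales/ratios are the common
# Faster R-CNN defaults, assumed rather than stated in the paper).
import math

def anchors_at(cx, cy, scales=(128, 256, 512), ratios=(0.5, 1.0, 2.0)):
    """Return (x1, y1, x2, y2) anchors centred at a feature-map cell mapped
    back to image coordinates (stride 16 for the VGG16 conv5 feature map)."""
    boxes = []
    for s in scales:
        for r in ratios:
            w = s / math.sqrt(r)     # ratio r = h / w; area stays s * s
            h = s * math.sqrt(r)
            boxes.append((cx - w / 2, cy - h / 2, cx + w / 2, cy + h / 2))
    return boxes

# Feature-map cell (i, j) maps to image centre ((j + 0.5)*16, (i + 0.5)*16).
cell = anchors_at((0 + 0.5) * 16, (0 + 0.5) * 16)
print(len(cell))                     # 9 anchors per cell
```

The RPN then scores each of these nine boxes per cell as foreground or background and regresses offsets to them.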
2.1.2 The VGG16 network model
The extraction of the feature map is crucial to the accuracy of the final result. The convolutional layers of VGG16 all use the same kernel parameters, as do the pooling layers, and the model is built by stacking convolutional and pooling layers, which makes it easy to form a deep network structure with strong feature extraction capability[27]. As used here, the network has 13 convolutional layers, 13 activation layers, and 4 pooling layers (Fig.3). The convolutions use a stride of 1, padding of 1, and 3×3 kernels, which keeps the image width and height unchanged after convolution while increasing network depth without excessive weight parameters. The pooling layers use 2×2 max pooling with a stride of 2; pooling does not change the number of channels, but halves the width and height each time. The convolution channel counts are 64, 128, 256, and 512, where the channel count is the number of feature maps produced by the convolution. Each convolutional layer is followed by a ReLU activation for nonlinear transformation, which changes neither the width and height nor the number of channels. With these settings, after the 13 convolutional and 4 pooling layers the width and height of the output feature map become 1/16 of the original image, and the channels change from the three RGB channels to 512.
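The 1/16 arithmetic can be checked with the standard layer shape formulas. The grouping of the 13 convolutions into five blocks with the last block unpooled follows the usual VGG16 layout as adopted by Faster R-CNN; the paper only states the totals (13 convolutions, 4 pools), so the grouping is an assumption.

```python
# Shape bookkeeping for the VGG16 configuration described above: 3x3 conv with
# stride 1 and padding 1 preserves width/height; each 2x2 stride-2 max pool
# halves them; four pools give 1/16 of the input.
def conv_out(size, kernel=3, stride=1, pad=1):
    """Output width/height of a convolution layer."""
    return (size + 2 * pad - kernel) // stride + 1

def pool_out(size, kernel=2, stride=2):
    """Output width/height of a max-pooling layer."""
    return (size - kernel) // stride + 1

w, h, c = 2560, 1920, 3                        # input image
blocks = [(2, 64, True), (2, 128, True),       # (convs, channels, pooled?)
          (3, 256, True), (3, 512, True), (3, 512, False)]
for n_convs, channels, pooled in blocks:
    for _ in range(n_convs):
        w, h = conv_out(w), conv_out(h)        # 3x3, stride 1, pad 1: unchanged
    c = channels
    if pooled:
        w, h = pool_out(w), pool_out(h)        # 2x2, stride 2: halved
print(w, h, c)                                 # 160 120 512, i.e. 1/16 of 2560x1920
```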
Fig.3  Structure of the VGG16 network
2.1.3 Evaluation indicators
To evaluate the trained models objectively, average precision (AP) and mean average precision (mAP) were selected as performance indicators. AP is the area under the P-R curve, where P is precision and R is recall. AP is defined per class, and a higher AP means better classification for that class. mAP is the mean of the APs over all classes, with a range of [0,1]; the larger the mAP, the better the detection performance of the trained model. P and R are computed as

P = TP/(TP + FP)
R = TP/(TP + FN)

where TP is the number of samples correctly classified as positive, FP is the number incorrectly classified as positive, and FN is the number incorrectly classified as negative.
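These quantities can be sketched directly. Since the dataset follows the PASCAL VOC2007 format, the 11-point interpolated AP is shown here; the paper does not state which AP variant it used, so that choice is an assumption, and the counts in the usage example are hypothetical.

```python
# Sketch of the evaluation quantities: P = TP/(TP+FP), R = TP/(TP+FN), and AP
# as the VOC2007-style 11-point interpolated area under the P-R curve.
def precision_recall(tp: int, fp: int, fn: int):
    p = tp / (tp + fp) if tp + fp else 0.0
    r = tp / (tp + fn) if tp + fn else 0.0
    return p, r

def ap_11point(pr_points):
    """pr_points: list of (recall, precision) samples of the P-R curve."""
    total = 0.0
    for t in [i / 10 for i in range(11)]:
        ps = [p for r, p in pr_points if r >= t]
        total += max(ps) if ps else 0.0
    return total / 11

p, r = precision_recall(tp=80, fp=20, fn=20)   # hypothetical counts
print(p, r)                                    # 0.8 0.8
```

mAP is then simply the mean of the per-class APs, here over the "flower" and "fruit" classes.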
2.1.4 Network training
With a fixed dataset, the final performance of a deep learning model depends on how well its hyperparameters are tuned. Hyperparameter optimization is a combinatorial optimization problem: unlike ordinary parameters, hyperparameters cannot be adjusted through learning and must be set manually or by an optimization algorithm. Grid search was used here. To keep the training samples consistent with images captured in the real environment, all data were used for the first 80% of training epochs and only the original data for the last 20%. With the learning rate fixed at 0.001, the important hyperparameters batch size and number of iterations were searched over a grid, with batch size in [32, 64] and iterations in [10000, 20000, 30000]. At test time, the extraction and classification of wolfberry flowering and fruit-ripening features were loaded into the trained network structure, 2560×1920 24-bit RGB true-color images of wolfberry growth were fed in for end-to-end convolutional processing, and the test results under the different parameters were evaluated with the indicators above (Table 1).
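The search itself amounts to evaluating every cell of the 2×3 grid and keeping the best. In the sketch below, evaluate() is a placeholder: only the best cell (mAP 0.74 at batch size 64 and 20000 iterations) is reported in the text, so all other mAP values are invented illustrations.

```python
# Sketch of the grid search over the two hyperparameters. evaluate() stands in
# for a full train-and-test run; apart from the reported 0.74 at (64, 20000),
# the mAP values below are hypothetical placeholders.
from itertools import product

GRID = {"batch_size": [32, 64], "iterations": [10000, 20000, 30000]}

def evaluate(batch_size, iterations):
    """Placeholder for training the model and returning test-set mAP."""
    fake_map = {(32, 10000): 0.61, (32, 20000): 0.66, (32, 30000): 0.64,
                (64, 10000): 0.68, (64, 20000): 0.74, (64, 30000): 0.70}
    return fake_map[(batch_size, iterations)]

best = max(product(GRID["batch_size"], GRID["iterations"]),
           key=lambda cell: evaluate(*cell))
print(best, evaluate(*best))             # (64, 20000) 0.74
```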
Table 1 shows that, on the simulation platform used in this study and with VGG16 as the feature extraction network, the identification of flowers and fruits on the test set was best when the learning rate was 0.001 and the key hyperparameters batch size and iterations were set to 64 and 20000, respectively.
Table 1  Test results of different hyperparameter combinations on the test set
Note: AP is average precision; mAP is mean average precision.
Comparing the universal flowering and fruit ripening periods automatically identified by Faster R-CNN in 2019 with the field observations (Table 2): at station Y0200 the automatic results were 2-11d later than the field observations overall, with the largest difference for flowering on the summer fruit-bearing shoots and the smallest for flowering on the first fruit-bearing shoots and autumn fruit ripening. Station Y0211 showed the opposite pattern: the automatic results were 0-12d earlier than the field records, with the largest difference again for flowering on the summer fruit-bearing shoots and the smallest for fruit ripening on the first fruit-bearing shoots. The Faster R-CNN results and the field observations thus disagree differently across developmental periods and stations. Two reasons explain the differences. First, the judgment objects differ: field observation specifies both the trees and the branches, so occlusion does not affect its results, whereas the automatic algorithm judges two-dimensional images and cannot resolve occlusion; when a feature is occluded or hidden from the camera, the automatic result is later than the field observation. Second, the criteria differ: field observation uses the percentage of observed branches showing the developmental features, while image identification uses five feature points in one image, so the two methods are not very comparable.
Table 2  Comparison between automatically identified and field-observed universal dates of the wolfberry flowering and fruit ripening periods in 2019
Note: Y0200 is the Yinchuan Wolfberry Research Institute observation station; Y0211 is the Zhongning Shilaba Village observation station. P1, P2, P3, P4, P5 and P6 are the flowering period on the first fruit-bearing shoots, fruit ripening on the first fruit-bearing shoots, the flowering period on the summer fruit-bearing shoots, fruit ripening on the summer fruit-bearing shoots, the flowering period on the autumn shoots, and autumn fruit ripening, respectively. "-" means the corresponding developmental date was not obtained. The same as below.
Because continuous overcast and rainy weather in September 2019 severely damaged the flower buds at station Y0211, the universal periods of autumn-shoot flowering and autumn fruit ripening were not observed afterwards; the automatic identification technique likewise did not identify these two periods, so the two methods agreed.
Comparing the universal flowering and fruit ripening periods automatically identified by Faster R-CNN in 2019 with the expert visual judgments (Table 3): at station Y0200 the automatic results differed from the expert judgments by 2-4d overall, with the largest difference for flowering on the summer fruit-bearing shoots and the smallest for flowering on the first fruit-bearing shoots and autumn fruit ripening; at station Y0211 the difference was 3-5d. Because of the weather, the universal periods of autumn-shoot flowering and autumn fruit ripening did not occur at Y0211 and the automatic technique likewise did not identify them, so the results agreed. Overall, the differences between the automatic results and the expert judgments were markedly smaller than those against field observation. The main reason is that the observation objects are the same (both methods obtain information from the images, so if a feature is completely occluded neither can judge it) and the criteria are also the same. The remaining differences arise when flowering or fruit-ripening features are partially occluded in the images: experts still recognize them, but the automatic algorithm misses them, so the expert judgment is clearly better in this respect (Fig.4) and the algorithm needs improvement on this point. In Fig.4a the experts identified six flowering features while the automatic method identified four; in Fig.4b the experts identified five fruit-ripening features while the automatic method identified three. In all cases the missed features were occluded by leaves or branches and not extracted by the algorithm, indicating that the features extracted with VGG16 are still deficient and need further improvement.
Table 3  Comparison between automatically identified universal dates and expert visual judgments of the wolfberry flowering and fruit ripening periods in 2019
Fig.4  Flowering and fruit-ripening features identified by Faster R-CNN versus expert visual judgment
Note: Red boxes are expert visual judgments; green anchors are automatic identification results.
(1) With the learning rate fixed, the important hyperparameters batch size and iterations were searched over a grid for the best combination; when batch size and iterations were set to 64 and 20000, the identification of wolfberry flowers and fruits on the test set was better than under other parameter settings.
(2) The universal flowering and fruit ripening periods automatically identified by Faster R-CNN differed from the field observations by 0-12d. The large differences arise because the two methods differ in both judgment object and criteria; since these differences cannot be fundamentally avoided, it is difficult to adjust or optimize the automatic identification algorithm with field observations.
(3) The Faster R-CNN results differed from the expert visual judgments by 2-5d. Both methods take the images as their judgment object and share the same criteria, so they are highly comparable; the main remaining source of difference is the deficiency of the features extracted by the algorithm, which can be optimized using the expert judgments.
As service needs change and technologies develop, automatic observation will be a trend in agrometeorological observation. For image-based automatic identification of developmental periods to replace current field observation, a sufficiently accurate algorithm is needed first, together with a set of norms and standards different from those of field observation. This study shows that field observation and image identification yield differences that are difficult to resolve; future work should therefore advance the formulation of norms and standards alongside research on identification techniques.
When judging whether an image has reached a developmental period, this study set the criterion at five feature points per image, a standard based on prior observation experience whose applicability requires further discussion. In addition, this study holds that the principle in the current field observation specification for judging when the crop population of a plot enters a developmental period can still be followed in automatic observation, determined by the percentage of images that have reached the universal period among all observed images.
[1]王纯枝,毛留喜,杨晓光,等.黄淮海地区冬小麦农业气象指标体系的构建[J].气象与环境科学,2019,42(1):3-10.
Wang C Z,Mao L X,Yang X G,et al.Construction of agrometeorological index system for winter wheat in HuangHuaiHai region[J].Meteorological and Environmental Sciences,2019,42(1):3-10.(in Chinese)
[2]中国气象局.农业气象观测规范[M].北京:气象出版社,1993:9-10.
China Meteorological Administration.Specification for agrometeorological observation[M].Beijing:China Meteorological Press,1993:9-10. (in Chinese)
[3]胡萌琦.普及型生态:农业气象自动观测方法研究与应用[D].南京:南京农业大学,2011.
Hu M Q.Study of universal eco-meteorological and agro-meteorological automatic observation method[D]. Nanjing:Nanjing Agriculture University,2011.(in Chinese)
[4]张雪芬,薛红喜,孙涵,等.自动农业气象观测系统功能与设计[J].应用气象学报,2012,23(1):105-112.
Zhang X F,Xue H X,Sun H,et al.Function and designing of automatic system for agro-meteorology[J].Journal of Applied Meteorological Science,2012,23(1):105-112.(in Chinese)
[5]李涛,吴东丽,胡锦涛,等.基于深度学习的玉米抽雄期判识[J].电子测量技术,2019,42(11):102-106.
Li T,Wu D L,Hu J T,et al.Maize tasseling period recognition based on deep learning[J].Electronic Measurement Technology,2019,42(11):102-106.(in Chinese)
[6]陆明,申双和,王春艳,等.基于图像识别技术的夏玉米生育期识别方法初探[J].中国农业气象,2011,32(3):423-429.
Lu M,Shen S H,Wang C Y,et al.Initial exploration of maize phenological stage based on image recognition[J].Chinese Journal of Agrometeorology,2011,32(3):423-429.(in Chinese)
[7]刘阗宇,冯全,杨森.基于卷积神经网络的葡萄叶片病害检测方法[J].东北农业大学学报,2018,49(3):73-83.
Liu T Y,Feng Q,Yang S.Detecting grape diseases based on convolutional neural network[J].Journal of Northeast Agricultural University,2018,49(3):73-83.(in Chinese)
[8]刘永娟.基于计算机视觉技术的玉米生育期识别研究[D].无锡:江南大学,2017.
Liu Y J.Research on recognition of corn growth period based on computer vision technology[D].Wuxi:Jiangnan University,2017.(in Chinese)
[9]熊俊涛,戴森鑫,区炯洪,等.基于深度学习的大豆生长期叶片缺素症状检测方法研究[J/OL].农业机械学报,http://kns.cnki.net/kcms/detail/11.1964.S.20191112.0943.004.html.
Xiong J T,Dai S X,Qu J H,et al.Leaf deficiency symptoms detection of soybean based on deep learning[J/OL]. Transactions of the Chinese Society for Agricultural Machinery,http://kns.cnki.net/kcms/detail/11.1964.S.20191112.0943.004.html.(in Chinese)
[10]张博,张苗辉,陈运忠.基于空间金字塔池化和深度卷积神经网络的作物害虫识别[J].农业工程学报,2019,35(19): 209-215.
Zhang B,Zhang M H,Chen Y Z.Crop pest identification based on spatial pyramid pooling and deep convolution neural network[J].Transactions of the CSAE,2019,35(19): 209-215.(in Chinese)
[11]王彦翔,张艳,杨成娅,等.基于深度学习的农作物病害图像识别技术进展[J].浙江农业学报,2019,31(4):669-676.
Wang Y X,Zhang Y,Yang C Y,et al.Advances in new nondestructive detection and identification techniques of crop diseases based on deep learning[J].Acta Agriculturae Zhejiangensis,2019,31(4):669-676.(in Chinese)
[12]梁胤豪,陈全,董彩霞,等.基于深度学习和无人机遥感技术的玉米雄穗检测研究[J].福建农业学报,2020,35(4): 456-464.
Liang Y H,Chen Q,Dong C X,et al.Application of deep-learning and UAV for field surveying corn tassel[J].Fujian Journal of Agricultural Sciences,2020,35(4):456-464.(in Chinese)
[13]邱锡鹏.神经网络与深度学习[M].北京:机械工业出版社,2020:114.
Qiu X P.Neural networks and deep learning[M].Beijing: China Machine Press,2020:114.(in Chinese)
[14]陈思聪.深度学习技术及在花卉识别中的应用[J].信息与电脑,2019(21):70-72.
Chen S C.Deep learning technology and its application in flower recognition[J].China Computer & Communication, 2019(21):70-72.(in Chinese)
[15]孙哲,张春龙,葛鲁镇,等.基于Faster R-CNN的田间西兰花幼苗图像检测方法[J].农业机械学报,2019,50(7): 216-221.
Sun Z,Zhang C L,Ge L Z,et al.Image detection method for broccoli seedlings in field based on faster R-CNN[J]. Transactions of the Chinese Society for Agricultural Machinery,2019,50(7):216-221.(in Chinese)
[16]席芮,姜凯,张万枝,等.基于改进Faster R-CNN的马铃薯芽眼识别方法[J].农业机械学报,2020,51(4):216-223.
Xi R,Jiang K,Zhang W Z.et al.Recognition method for potato buds based on improved faster R-CNN[J]. Transactions of the Chinese Society for Agricultural Machinery,2020, 51(4):216-223. (in Chinese)
[17]王静涛,宋文龙,李克新,等.依据Faster R-CNN的活体植株叶片气孔检测方法[J].东北林业大学学报,2020,48(2): 34-39.
Wang J T,Song W L,Li K X,et al.A Stoma detection method for living plant leaves with faster R-CNN[J]. Journal of Northeast Forestry University,2020,48(2):34-39. (in Chinese)
[18]Shu Q T,Jing P,Hao F L,et al.Passion fruit detection and counting based on multiple scale Faster R-CNN using RGB-D images[J].Precision Agriculture,2020,21(11):1072-1091.
[19]李就好,林乐坚,田凯,等.改进Faster R-CNN的田间苦瓜叶部病害检测[J].农业工程学报,2020,36(12):179-185.
Li J H,Lin L J,Tian K,et al.Detection of leaf diseases of balsam pear in the field based on improved Faster R-CNN[J].Transactions of the CSAE,2020,36(12):179-185. (in Chinese)
[20]李春明,逯杉婷,远松灵,等.基于Faster R-CNN的除草机器人杂草识别算法[J].中国农机化学报,2019,40(12): 171-176.
Li C M,Lu S T,Yuan S L,et al.Weed identification algorithm of weeding robot based on Faster R-CNN[J]. Journal of Chinese Agricultural Mechanization,2019, 40(12):171-176.(in Chinese)
[21]张乐,金秀,傅雷扬,等.基于Faster R-CNN深度网络的油菜田间杂草识别方法[J].激光与光电子学进展,2020, 57(2): 304-312.
Zhang L,Jin X,Fu L Y,et al.Recognition approaches of weeds in rapeseed field based on Faster R-CNN deep network[J].Laser & Optoelectronics Progress,2020,57(2): 304-312.(in Chinese)
[22]彭明霞,夏俊芳,彭辉.融合FPN的Faster R-CNN复杂背景下棉田杂草高效识别方法[J].农业工程学报,2019, 35(20):202-209.
Peng M X,Xia J F,Peng H.Efficient recognition of cotton and weed in field based on Faster R-CNN by integrating FPN[J].Transactions of the CSAE,2019,35(20):202-209.(in Chinese)
[23]Wang J W,Luo H B,Yu P F,et al.Small objects detection in UAV aerial images based on improved Faster R-CNN[J]. Journal of Measurement Science and Instrumentation, 2020,1(11):11-16.
[24]周兵,李润鑫,尚振宏,等.基于改进的Faster R-CNN目标检测算法[J].激光与光电子学进展,2020,57(10):105-112.
Zhou B,Li R X,Shang Z H,et al.Object detection algorithm based on improved Faster R-CNN[J].Laser & Optoelectronics Progress,2020,57(10):105-112.(in Chinese)
[25]刘静,马力文,李润怀,等.农业气象观测规范枸杞:QX/T 282-2015[S].中国气象局,2015.
Liu J,Ma L W,Li R H,et al.Specification for agrometeorological observation:QX/T 282-2015[S].China Meteorological Administration,2015. (in Chinese)
[26]Ren S Q,He K M,Girshick R,et al.Faster R-CNN:towards real-time object detection with region proposal networks[J].IEEE Transactions on Pattern Analysis and Machine Intelligence,2017,39(6):1137-1149.
[27]Simonyan K,Zisserman A.Very deep convolutional networks for large-scale image recognition[J/OL].arXiv preprint,2014[2014-09-01].https://arxiv.org/abs/1409.1556.
Automatic Identification Technology of Wolfberry Flowering Period and Fruit Ripening Period Based on Faster R-CNN
ZHU Yong-ning1,3,4, ZHOU Wang2, YANG Yang1,3,4, LI Jian-ping1,3,4, LI Wan-chun1,3,4, JIN Hong-wei2, FANG Feng2
(1.Key Laboratory for Meteorological Disaster Monitoring and Early Warning and Risk Management of Characteristic Agriculture in Arid Regions, CMA, Yinchuan 750002, China; 2.Aerospace Newsky Technology Co., Ltd, Wuxi 214000, China; 3.Key Laboratory of Meteorological Disaster Prevention and Reduction of Ningxia, Yinchuan 750002, China; 4.Ningxia Institute of Meteorological Science, Yinchuan 750002, China)
From 2018 to 2019, 16 sets of wolfberry farmland monitoring systems were built in Ningxia. Each system took 10 images every day, and over 30,000 images of the growth of wolfberry trees were taken in two years. To study the recognition technology of the flowering period and fruit ripening period of wolfberry based on these images, three methods were used in this paper to judge the developmental stage of wolfberry. The first one was the field observation method. In this method, two fields where the real-scene monitoring system was installed were selected, and the wolfberry trees in the two fields were manually observed once every two days during the growing season. The wolfberry trees selected for manual observation were consistent with the ones photographed by the farmland monitoring systems. The second method was expert visual judgment, in which 5 experienced experts were invited to judge all the images. The judgment standard was as follows: if there were 5 features of a certain developmental period in an image, it was considered that this wolfberry tree had reached the universal period of this developmental period; if 5 out of 10 images on a certain day reached the universal period, it was considered that the wolfberry population in the field had entered this developmental period. Based on the opinions of the experts, the result of the expert visual judgment was given. The third method was the automatic recognition method. In this method, more than 3000 images with characteristics of wolfberry flowering and fruit ripening were screened out from all the images. After removing the images with lens fouling or unsatisfactory fields of view, 1210 image samples remained. To avoid underfitting or overfitting due to too few or too many images of a certain category being involved in training, rotation, cropping and flipping were used for data enhancement. The data-enhanced samples were divided according to the format of the PASCAL VOC2007 data set.
Finally, a total of 7260 experimental samples were obtained, including 5808 images in the training set and 1452 images in the test set. According to the distinctive image characteristics of wolfberry in the flowering and fruit ripening periods, the labelImg tool was used to label all the flowers and fruits in the image samples, marking 12100 'flower' labels and 11602 'fruit' labels. Then, the faster region-based convolutional neural network (Faster R-CNN) was used to train on and classify the selected images and to construct the algorithm for identifying the flowering period and fruit ripening period of wolfberry. In the constructed algorithm, the judgment standard was the same as that of the second method, and a time series judgment was introduced when judging the different stages of flowering or fruit ripening. Taking AP and mAP as the evaluation indicators of the automatic recognition model, the results showed that the mAP value could reach 0.74 on the test set when the important hyperparameters batch size and number of iterations were set to 64 and 20000 respectively, which outperformed the other hyperparameter settings. Comparing the results of the three methods, the difference between the automatic recognition results and the field observation records of the same period was 0-12 days. The main reason for the difference was that the observation objects and standards of the two methods were inconsistent. The observation object of the automatic recognition method was a two-dimensional image, so no judgment could be made when a feature was occluded, whereas the object of field observation is the wolfberry tree itself, which is not affected by occlusion. Besides, the standards of these two methods were different.
The standard of the automatic recognition method was based on the number of feature points observed in the image, while the field observation method was based on the ratio of observed to expected feature points of the wolfberry tree, which cannot be obtained by the automatic recognition method. The difference between the two methods could not be fundamentally eliminated, so it was difficult to optimize the automatic recognition algorithm using the results of the field observation method. The comparison also showed that the difference between the automatic recognition results and the expert visual judgments was within 2-5d. The judgment objects and standards of these two methods were consistent, so the results were highly comparable, and the results of expert visual judgment could be used as the verification standard to optimize and adjust the automatic recognition method.
Wolfberry; Flowering period recognition; Fruit ripening period recognition; Growth stage recognition; Faster R-CNN; Automatic image recognition
doi:10.3969/j.issn.1000-6362.2020.10.006
Zhu Y N,Zhou W,Yang Y,et al.Identification of the flowering and fruit ripening periods of wolfberry based on Faster R-CNN[J].Chinese Journal of Agrometeorology,2020,41(10):668-677
Received: 2020-05-20
Corresponding author: ZHOU Wang, E-mail: zhou.wang@js1959.com
Supported by the Open Research Fund of the Key Laboratory for Meteorological Disaster Monitoring and Early Warning and Risk Management of Characteristic Agriculture in Arid Regions, CMA (CAMF-201813); the 4th Ningxia Youth Science and Technology Talent Support Project (TJGC2019058); the Key R&D Program of Ningxia Hui Autonomous Region (2019BEH03008); and the Key R&D Project of Ningxia Hui Autonomous Region (2017BY080)
First author: ZHU Yong-ning, E-mail: zhuyongning.007@163.com