参考文献/References:
[1] 韩伟聪, 鲍光海. 基于机器视觉的竹材尺寸测量系统设计[J]. 中国测试, 2016, 42(7): 74-78.[2] 张萍萍, 李童, 李茹, 等. 一种改进的Sobel图像边缘检测算法及其实现[J]. 电视技术, 2022, 46(5): 42-45.[3] 李迪, 吴奇, 杨浩森. 基于改进Sobel算子的边缘检测系统的设计与实现[J]. 信息技术与网络安全, 2022, 41(3): 13-17.[4] 位营杰, 师红宇. 基于Canny算子的优化研究[J]. 国外电子测量技术, 2021, 40(8): 77-81.[5] 杜绪伟, 陈东, 马兆昆, 等. 基于Canny算子的改进图像边缘检测算法[J]. 计算机与数字工程, 2022, 50(2): 410-413, 457.[6] 徐武, 张强, 王欣达, 等. 基于改进Canny算子的图像边缘检测方法[J]. 激光杂志, 2022, 43(4): 103-108.[7] 朱森荣, 刘杰徽. 基于最小二乘法椭圆拟合的改进型快速算法[J]. 舰船电子工程, 2022, 42(1): 33-35.[8] 夏海波. 基于Visual C++的图像增强和轮廓提取研究[J]. 工矿自动化, 2011, 37(3): 44-47.[9] 鲍华良, 赵娅. 经典Canny边缘检测的量子实现[J]. 吉林大学学报(信息科学版), 2022, 40(1): 36-50.[10] GANDER W, GOLUB G H, STREBEL R. Least-squares fitting of circles and ellipses[J]. BIT Numerical Mathematics, 1994, 34(4): 558-578.
相似文献/References:
[1]黄靖、李俊男、刘丽桑、罗堪、夏正邦、王泽洲.基于形态学重建与OTSU的极耳焊缝图像分割方法[J].福建工程学院学报,2019,17(04):359.[doi:10.3969/j.issn.1672-4348.2019.04.009]
HUANG Jing,LI Junnan,LIU Lisang,et al.Polar ear weld image segmentation method based onmorphological reconstruction and OTSU[J].Journal of FuJian University of Technology,2019,17(06):359.[doi:10.3969/j.issn.1672-4348.2019.04.009]
[2]檀甫贵、邹复民、刘丽桑、李建兴.基于机器视觉的软包锂电池表面缺陷检测[J].福建工程学院学报,2020,18(03):267.[doi:10.3969/j.issn.1672-4348.2020.03.012]
TAN Fugui,ZOU Fumin,LIU Lisang,et al.Surface defect detection of soft-pack lithium battery based on machine vision[J].Journal of FuJian University of Technology,2020,18(06):267.[doi:10.3969/j.issn.1672-4348.2020.03.012]
[3]戴福全、刘路杰.基于视觉引导的机器人抓取分类系统设计[J].福建工程学院学报,2020,18(06):530.[doi:10.3969/j.issn.1672-4348.2020.06.004]
DAI Fuquan,LIU Lujie.Design of vision-guided robotic grasping classification system[J].Journal of FuJian University of Technology,2020,18(06):530.[doi:10.3969/j.issn.1672-4348.2020.06.004]
[4]林亚君、陈学军.基于Ostu优化的PCNN电力故障区域提取[J].福建工程学院学报,2020,18(06):593.[doi:10.3969/j.issn.1672-4348.2020.06.015]
LIN Yajun,CHEN Xuejun.Fault zone extraction of electrical equipment by using PCNN based on Otsu optimization[J].Journal of FuJian University of Technology,2020,18(06):593.[doi:10.3969/j.issn.1672-4348.2020.06.015]
[5]李建兴、林华良、俞斌、陈炜、林晨煌、黄诗婷.基于机器视觉的汽车角窗玻璃混线检测算法[J].福建工程学院学报,2021,19(03):223.[doi:10.3969/j.issn.1672-4348.2021.03.004]
LI Jianxing,LIN Hualiang,YU Bin,et al.Machine vision-based non-congeneric product detection algorithm for vehicle quarter glass[J].Journal of FuJian University of Technology,2021,19(06):223.[doi:10.3969/j.issn.1672-4348.2021.03.004]
[6]张平均,翁悦,王小红,等.基于改进UNet的人造板表面缺陷的图像分割方法[J].福建工程学院学报,2022,20(04):373.[doi:10.3969/j.issn.1672-4348.2022.04.011]
ZHANG Pingjun,WENG Yue,WANG Xiaohong,et al.Image segmentation method of surface defects of wood-based panels based on improved UNet[J].Journal of FuJian University of Technology,2022,20(06):373.[doi:10.3969/j.issn.1672-4348.2022.04.011]
[7]李济泽,位威,张凯凯.基于机器视觉的养殖鱼摄食行为识别方法[J].福建工程学院学报,2022,20(04):378.[doi:10.3969/j.issn.1672-4348.2022.04.012]
LI Jize,WEI Wei,ZHANG Kaikai.Recognition method of fish feeding behavior based on machine vision[J].Journal of FuJian University of Technology,2022,20(06):378.[doi:10.3969/j.issn.1672-4348.2022.04.012]
[8]仓大健,吴选忠,李占福.基于R-Y通道冗余去除钢筋层阴影方法[J].福建工程学院学报,2022,20(04):391.[doi:10.3969/j.issn.1672-4348.2022.04.014]
CANG Dajian,WU Xuanzhong,LI Zhanfu.Spatial ghosting elimination method of reinforcement layer based on R-Y channel redundancy removal[J].Journal of FuJian University of Technology,2022,20(06):391.[doi:10.3969/j.issn.1672-4348.2022.04.014]
[9]戚云涛,曾寿金,方宇轩,等.基于粒子群优化的双凸透镜缺陷Ostu阈值分割算法[J].福建工程学院学报,2023,21(06):598.[doi:10.3969/j.issn.1672-4348.2023.06.014]
QI Yuntao,ZENG Shoujin,FANG Yuxuan,et al.Otsu threshold segmentation algorithm for defects ofdouble convex lens based on particle swarm optimization[J].Journal of FuJian University of Technology,2023,21(06):598.[doi:10.3969/j.issn.1672-4348.2023.06.014]
[10]董世超,陈丙三,连长伟,等.基于机器视觉的烟梗长度测量方法[J].福建工程学院学报,2024,22(01):74.[doi:10.3969/j.issn.2097-3853.2024.01.011]
DONG Shichao,CHEN Bingsan,LIAN Changwei,et al.Measurement method of tobacco stem length based on machine vision[J].Journal of FuJian University of Technology,2024,22(06):74.[doi:10.3969/j.issn.2097-3853.2024.01.011]