Chenming Li, Xiaoyu Qu, Yao Yang, Hongmin Gao, Yongchang Wang, Dan Yao, Wenjing Yuan
This study proposes a new activation function, the S-type rectified linear unit (SReLU), to alleviate gradient dispersion in neural network models and improve the segmentation precision of high-resolution remote sensing images (HRIs).
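The excerpt does not define SReLU itself. As a hedged illustration only, one common way to build an S-shaped rectified linear unit (in the style of prior S-shaped ReLU work, not necessarily this paper's formulation) is a three-segment piecewise-linear function whose outer segments keep small nonzero slopes, so gradients do not vanish on saturated inputs; the thresholds `t_l`, `t_r` and slopes `a_l`, `a_r` below are hypothetical parameter names:

```python
import numpy as np

def srelu(x, t_r=1.0, a_r=0.2, t_l=-1.0, a_l=0.2):
    """Hypothetical S-shaped ReLU sketch (illustrative, not the paper's exact SReLU).

    Identity on [t_l, t_r]; linear with small slope a_l (resp. a_r)
    below t_l (resp. above t_r), so the derivative is nonzero everywhere,
    which is what mitigates gradient dispersion.
    """
    return np.where(x >= t_r, t_r + a_r * (x - t_r),
                    np.where(x <= t_l, t_l + a_l * (x - t_l), x))
```

Because every segment has a nonzero slope, backpropagated gradients are attenuated but never zeroed, unlike a plain ReLU on negative inputs.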
Xiaofeng Li, Shuqing Zhang, Qiang Liu, Bai Zhang, Dianwei Liu, Bibo Lu, Xiaodong Na
Lin Wu, Zhaoxiang Zhang, Yunhong Wang, Qingjie Liu
Hongjie Tao, Zhaofei Li, Fei Qi, Jingjue Chen, Hao Zhou
Xia Hua, Xinqing Wang, Ting Rui, Faming Shao, Dong Wang
Chengjie Zhu, Shizhi Yang, Shengcheng Cui, Wei Cheng, Cheng Chen