Visual-tactile manipulation for household waste collection in outdoor environments

  1. Castaño-Amorós, Julio 1
  2. Páez-Ubieta, Ignacio de Loyola 1
  3. Gil, Pablo 2
  4. Puente, Santiago Timoteo 2
  1. Universidad de Alicante; Universidad Miguel Hernández
  2. Universitat d'Alacant, Alicante, Spain (ROR: https://ror.org/05t8bcz72)

Journal: Revista Iberoamericana de Automática e Informática Industrial (RIAI)

ISSN: 1697-7920

Year of publication: 2023

Volume: 20

Issue: 2

Pages: 163-174

Type: Article

DOI: 10.4995/RIAI.2022.18534 (open access)

Abstract

This work presents a perception system applied to robotic manipulation that is able to assist in navigation, household waste classification, and collection in outdoor environments. The system is made up of optical tactile sensors, RGBD cameras, and a LiDAR, all integrated on a mobile platform with a robot manipulator and a robotic gripper. Our system is divided into three software modules: two are vision-based and the third is tactile-based. The vision-based modules use CNNs to localize and recognize solid household waste and to estimate grasping points. The tactile-based module, which also uses CNNs and image processing, adjusts the gripper opening to control the grasp from touch data. Our proposal achieves localization errors around 6%, a recognition accuracy of 98%, and ensures grasping stability in 91% of the attempts. The sum of the runtimes of the three modules is less than 750 ms.
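
The abstract describes a three-module software architecture (two vision-based modules and one tactile-based module). As a reading aid only, the Python sketch below shows one way such a pipeline could be wired together; every class, method, and numeric value is a hypothetical placeholder and does not reproduce the authors' implementation.

```python
# Illustrative sketch of the three-module pipeline summarized in the abstract.
# All names, interfaces, and numeric values are hypothetical placeholders;
# none of this is taken from the paper or its code.
from dataclasses import dataclass
from typing import Tuple


@dataclass
class GraspPlan:
    object_class: str                          # recognized waste category
    grasp_point: Tuple[float, float, float]    # (x, y, z) in the manipulator frame
    gripper_opening: float                     # commanded aperture in metres


class WasteCollectionPipeline:
    def __init__(self, detector, grasp_estimator, tactile_controller):
        self.detector = detector                      # vision module 1: CNN localization/recognition
        self.grasp_estimator = grasp_estimator        # vision module 2: grasping-point estimation
        self.tactile_controller = tactile_controller  # tactile module: optical touch sensing + CNN

    def plan_grasp(self, rgbd_frame) -> GraspPlan:
        # Localize and recognize the household waste item in the RGBD frame.
        detection = self.detector.detect(rgbd_frame)
        # Estimate a grasping point for the detected object.
        point = self.grasp_estimator.estimate(rgbd_frame, detection)
        # Start with an aperture slightly wider than the detected object width;
        # the tactile module refines it after contact.
        return GraspPlan(detection.label, point, detection.width + 0.01)

    def close_until_stable(self, plan: GraspPlan, step: float = 0.001) -> float:
        # Reduce the gripper opening in small steps until the tactile CNN
        # reports a stable contact, instead of trusting vision alone.
        opening = plan.gripper_opening
        while not self.tactile_controller.contact_is_stable():
            opening -= step
            self.tactile_controller.set_opening(opening)
        return opening
```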

Bibliographic References

  • Altikat, A., Gulbe, A., Altikat, S., 2022. Intelligent solid waste classification using deep convolutional neural networks. Int. J. Environmental Science and Technology 19, 1285-1292. https://doi.org/10.1007/s13762-021-03179-4
  • Bircanoglu, C., Atay, M., Beser, F., Genç, Ö., Kızrak, M. A., 2018. Recyclenet: Intelligent waste sorting using deep neural networks. In: Innovations in Intelligent Systems and Applications. pp. 1-7. https://doi.org/10.1109/INISTA.2018.8466276
  • Bohg, J., Morales, A., Asfour, T., Kragic, D., 2013. Data-driven grasp synthesis- a survey. IEEE Transactions on robotics 30 (2), 289-309. https://doi.org/10.1109/TRO.2013.2289018
  • Bolya, D., Zhou, C., Xiao, F., Lee, Y., 2019. Yolact: Real-time instance segmentation. In: IEEE/CVF Int. Conf. on Computer Vision. pp. 9157-9166. https://doi.org/10.1109/ICCV.2019.00925
  • Castaño-Amoros, J., Gil, P., Puente, S., 2021. Touch detection with low-cost visual-based sensor. In: 2nd Int. Conf. on Robotics, Computer Vision and Intelligent Systems. pp. 136-142. https://doi.org/10.5220/0010699800003061
  • De Gea, V., Puente, S., Gil, P., 2021. Domestic waste detection and grasping points for robotic picking up. In: IEEE Int. Conf. on Robotics and Automation, Workshop on Emerging Paradigms for Robotic Manipulation: From the Lab to the Productive World. https://doi.org/10.48550/arXiv.2105.06825
  • Del Pino, I., Muñoz-Bañon, M., Cova-Rocamora, S., Contreras, M., Candelas, F., Torres, F., 2020. Deeper in blue. Journal of Intelligent & Robotic Systems 98, 207-225. https://doi.org/10.1007/s10846-019-00983-6
  • Donlon, E., Dong, S., Liu, M., Li, J., Adelson, E., Rodriguez, A., 2018. Gelslim: A high-resolution, compact, robust, and calibrated tactile-sensing finger. In: 2018 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS). IEEE, pp. 1927-1934. https://doi.org/10.1109/IROS.2018.8593661
  • Feng, J., Tang, X., Jiang, X., Chen, Q., 2021. Garbage disposal of complex background based on deep learning with limited hardware resources. IEEE Sensors Journal 21(8), 21050-21058. https://doi.org/10.1109/JSEN.2021.3100636
  • Fu, B., Li, S., Wei, J., Li, Q., Wang, Q., Tu, J., 2021. A novel intelligent garbage classification system based on deep learning and an embedded linux system. IEEE Access 9, 131134-131146. https://doi.org/10.1109/ACCESS.2021.3114496
  • Guo, N., Zhang, B., Zhou, J., Zhan, K., Lai, S., 2020. Pose estimation and adaptable grasp configuration with point cloud registration and geometry understanding for fruit grasp planning. Computers and Electronics in Agriculture 179, 105818. https://doi.org/10.1016/j.compag.2020.105818
  • He, K., Zhang, X., Ren, S., Sun, J., 2016. Deep residual learning for image recognition. In: IEEE Conf. on Computer Vision and Pattern Recognition. https://doi.org/10.1109/CVPR.2016.90
  • Jiang, D., Li, G., Sun, Y., Hu, J., Yun, J., Liu, Y., 2021. Manipulator grabbing position detection with information fusion of color image and depth image using deep learning. Journal of Ambient Intelligence and Humanized Computing 12 (12), 10809-10822. https://doi.org/10.1007/s12652-020-02843-w
  • Kim, D., Li, A., Lee, J., 2021. Stable robotic grasping of multiple objects using deep neural networks. Robotica 39 (4), 735-748. https://doi.org/10.1017/S0263574720000703
  • Kiyokawa, T., Katayama, H., Tatsuta, Y., Takamatsu, J., Ogasawara, T., 2021. Robotic waste sorter with agile manipulation and quickly trainable detector. IEEE Access 9, 124616-124631. https://doi.org/10.1109/ACCESS.2021.3110795
  • Kolamuri, R., Si, Z., Zhang, Y., Agarwal, A., Yuan, W., 2021. Improving grasp stability with rotation measurement from tactile sensing. In: 2021 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS). IEEE, pp. 6809-6816. https://doi.org/10.1109/IROS51168.2021.9636488
  • Lambeta, M., Chou, P.-W., Tian, S., Yang, B., Maloon, B., Most, V., Stroud, D., Santos, R., Byagowi, A., Kammerer, G., Jayaraman, D., Calandra, R., 2020. Digit: A novel design for a low-cost compact high-resolution tactile sensor with application to in-hand manipulation. IEEE Robotics and Automation Letters 5 (3), 3838-3845. https://doi.org/10.1109/LRA.2020.2977257
  • Lin, Y., Lloyd, J., Church, A., Lepora, N. F., 2022. Tactile gym 2.0: Sim-to-real deep reinforcement learning for comparing low-cost high-resolution robot touch. IEEE Robotics and Automation Letters 7 (4), 10754-10761. https://doi.org/10.1109/LRA.2022.3195195
  • Liu, L., Ouyang, W., Wang, X., Fieguth, P., Chen, J., Liu, X., Pietikäinen, M., 2020. Deep learning for generic object detection: A survey. Int. J. of Computer Vision 128, 261-318. https://doi.org/10.1007/s11263-019-01247-4
  • Liu, Y., Jiang, D., Duan, H., Sun, Y., Li, G., Tao, B., Yun, J., Liu, Y., Chen, B., 2021. Dynamic gesture recognition algorithm based on 3d convolutional neural network. Computational Intelligence and Neuroscience 2021. https://doi.org/10.1155/2021/4828102
  • Minaee, S., Boykov, Y., Porikli, F., Plaza, A., Kehtarnavaz, N., Terzopoulos, D., 2020. Image segmentation using deep learning: A survey. IEEE Transactions on Pattern Analysis and Machine Intelligence. https://doi.org/10.1109/TPAMI.2021.3059968
  • Newbury, R., Gu, M., Chumbley, L., Mousavian, A., Eppner, C., Leitner, J., Bohg, J., Morales, A., Asfour, T., Kragic, D., et al., 2022. Deep learning approaches to grasp synthesis: A review. arXiv preprint arXiv:2207.02556.
  • Patrizi, A., Gambosi, G., Zanzotto, F., 2021. Data augmentation using background replacement for automated sorting of littered waste. J. of Imaging 7(8), 144. https://doi.org/10.3390/jimaging7080144
  • Redmon, J., 2014. Darknet: Open source neural networks in C. http://pjreddie.com/darknet/.
  • Sahbani, A., El-Khoury, S., Bidaud, P., 2012. An overview of 3d object grasp synthesis algorithms. Robotics and Autonomous Systems 60 (3), 326-336. https://doi.org/10.1016/j.robot.2011.07.016
  • Sandler, M., Howard, A., Zhu, M., Zhmoginov, A., Chen, L.-C., 2018. Mobilenetv2: Inverted residuals and linear bottlenecks. In: IEEE Conf. on Computer Vision and Pattern Recognition. pp. 4510-4520. https://doi.org/10.1109/CVPR.2018.00474
  • Sandykbayeva, D., Kappassov, Z., Orazbayev, B., 2022. Vibrotouch: Active tactile sensor for contact detection and force sensing via vibrations. Sensors 22 (17). https://doi.org/10.3390/s22176456
  • Shaw-Cortez, W., Oetomo, D., Manzie, C., Choong, P., 2018. Tactile-based blind grasping: A discrete-time object manipulation controller for robotic hands. IEEE Robotics and Automation Letters 3 (2), 1064-1071. https://doi.org/10.1109/LRA.2018.2794612
  • Simonyan, K., Zisserman, A., 2015. Very deep convolutional networks for large-scale image recognition. In: 3rd Int. Conf. on Learning Representations. https://doi.org/10.48550/arXiv.1409.1556
  • Suárez, R., Palomo-Avellaneda, L., Martínez, J., Clos, D., García, N., 2020. Manipulador móvil, bibrazo y diestro con nuevas ruedas omnidireccionales. Revista Iberoamericana de Automática e Informática Industrial 17 (1), 10-21. https://doi.org/10.4995/riai.2019.11422
  • Szegedy, C., Vanhoucke, V., Ioffe, S., Shlens, J., Wojna, Z., 2016. Rethinking the inception architecture for computer vision. In: IEEE Conf. on Computer Vision and Pattern Recognition. pp. 2818-2826. https://doi.org/10.1109/CVPR.2016.308
  • Velasco, E., Zapata-Impata, B. S., Gil, P., Torres, F., 2020. Clasificación de objetos usando percepción bimodal de palpación única en acciones de agarre robótico. Revista Iberoamericana de Automática e Informática Industrial 17 (1), 44-55. https://doi.org/10.4995/riai.2019.10923
  • Vo, A. H., Son, L., Vo, M., Le, T., 2019. A novel framework for trash classification using deep transfer learning. IEEE Access 7, 178631-178639. https://doi.org/10.1109/ACCESS.2019.2959033
  • Ward-Cherrier, B., Pestell, N., Cramphorn, L., Winstone, B., Giannaccini, M. E., Rossiter, J., Lepora, N. F., 2018. The tactip family: Soft optical tactile sensors with 3d-printed biomimetic morphologies. Soft robotics 5 (2), 216-227. https://doi.org/10.1089/soro.2017.0052
  • Yao, T., Guo, X., Li, C., Qi, H., Lin, H., Liu, L., Dai, Y., Qu, L., Huang, Z., Liu, P., et al., 2020. Highly sensitive capacitive flexible 3d-force tactile sensors for robotic grasping and manipulation. Journal of Physics D: Applied Physics 53 (44), 445109. https://doi.org/10.1088/1361-6463/aba5c0
  • Yuan, W., Dong, S., Adelson, E. H., 2017. Gelsight: High-resolution robot tactile sensors for estimating geometry and force. Sensors 17 (12), 2762. https://doi.org/10.3390/s17122762
  • Zapata-Impata, B., Gil, P., Pomares, J., Torres, F., 2019a. Fast geometry-based computation of grasping points on three-dimensional point clouds. Int. J. of Advanced Robotic Systems, 1-18. https://doi.org/10.1177/1729881419831846
  • Zapata-Impata, B. S., Gil, P., Torres, F., 2019b. Learning spatio temporal tactile features with a convlstm for the direction of slip detection. Sensors 19 (3), 523. https://doi.org/10.3390/s19030523