Please use this identifier to cite or link to this item: https://hdl.handle.net/10316/113243
DC Field | Value | Language
dc.contributor.author | Duarte, Laura | -
dc.contributor.author | Neto, Pedro | -
dc.date.accessioned | 2024-02-09T11:26:02Z | -
dc.date.available | 2024-02-09T11:26:02Z | -
dc.date.issued | 2023-03-15 | -
dc.identifier.issn | 0278-6125 | pt
dc.identifier.uri | https://hdl.handle.net/10316/113243 | -
dc.description.abstract | Collaborative robots are increasingly present in industry to support human activities. However, to make the human-robot collaborative process more effective, there are several challenges to be addressed. Collaborative robotic systems need to be aware of human activities to (1) anticipate collaborative/assistive actions, (2) learn by demonstration, and (3) activate safety procedures in shared workspaces. This study proposes an action classification system to recognize primitive assembly tasks from human motion event data captured by a Dynamic and Active-pixel Vision Sensor (DAVIS). Several filters are compared and combined to remove event data noise. Task patterns are classified from a continuous stream of event data using advanced deep learning and recurrent networks to classify spatial and temporal features. Experiments were conducted on a novel dataset, the dataset of manufacturing tasks (DMT22), featuring 5 classes of representative manufacturing primitives (PickUp, Place, Screw, Hold, Idle) from 5 participants. Results show that the proposed filters remove about 65% of all events (noise) per recording, leading to a classification accuracy of up to 99.37% for subjects that trained the system and 97.08% for new subjects. Data from a left-handed subject were successfully classified using only right-handed training data. These results are object independent. | pt
dc.language.iso | eng | pt
dc.publisher | Elsevier | pt
dc.relation | 2021.06508.BD | pt
dc.relation | info:eu-repo/grantAgreement/FCT/6817 - DCRRNI ID/UIDB/00285/2020/PT/Centre for Mechanical Engineering, Materials and Processes | pt
dc.rights | openAccess | pt
dc.rights.uri | http://creativecommons.org/licenses/by-nc/4.0/ | pt
dc.subject | Task classification | pt
dc.subject | Manufacturing | pt
dc.subject | Event data | pt
dc.subject | Deep learning | pt
dc.subject | Collaborative robotics | pt
dc.title | Classification of Primitive Manufacturing Tasks from Filtered Event Data | pt
dc.type | article | -
degois.publication.firstPage | 12 | pt
degois.publication.lastPage | 24 | pt
degois.publication.title | Journal of Manufacturing Systems | pt
dc.peerreviewed | yes | pt
dc.identifier.doi | 10.1016/j.jmsy.2023.03.001 | pt
degois.publication.volume | 68 | pt
dc.date.embargo | 2023-03-15 | *
uc.date.periodoEmbargo | 0 | pt
item.grantfulltext | open | -
item.cerifentitytype | Publications | -
item.languageiso639-1 | en | -
item.openairetype | article | -
item.openairecristype | http://purl.org/coar/resource_type/c_18cf | -
item.fulltext | With full text | -
crisitem.author.researchunit | CEMMPRE - Centre for Mechanical Engineering, Materials and Processes | -
crisitem.author.researchunit | CEMMPRE - Centre for Mechanical Engineering, Materials and Processes | -
crisitem.author.orcid | 0000-0001-8055-2865 | -
crisitem.author.orcid | 0000-0003-2177-5078 | -
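The abstract mentions filtering noise from DAVIS event data before classification. As a rough illustration of that step, the sketch below implements a generic background-activity filter, a standard denoising technique for event cameras; the event layout, function name, and parameter values are illustrative assumptions, not the authors' actual filter combination.

```python
from dataclasses import dataclass

@dataclass
class Event:
    x: int      # pixel column
    y: int      # pixel row
    t: float    # timestamp in microseconds
    p: int      # polarity (+1 / -1)

def background_activity_filter(events, dt=5000, radius=1):
    """Keep an event only if a spatially neighbouring pixel fired within
    the last `dt` microseconds; isolated events are treated as noise."""
    last_ts = {}  # (x, y) -> most recent event timestamp at that pixel
    kept = []
    for e in sorted(events, key=lambda e: e.t):
        supported = any(
            (nx, ny) in last_ts and e.t - last_ts[(nx, ny)] <= dt
            for nx in range(e.x - radius, e.x + radius + 1)
            for ny in range(e.y - radius, e.y + radius + 1)
            if (nx, ny) != (e.x, e.y)
        )
        if supported:
            kept.append(e)
        last_ts[(e.x, e.y)] = e.t
    return kept
```

The underlying assumption is that genuine motion produces spatio-temporally correlated clusters of events, while sensor noise fires in isolation, so an event with no recent neighbour can be discarded.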
Appears in Collections:I&D CEMMPRE - Artigos em Revistas Internacionais
This item is licensed under a Creative Commons License.