Revisiting Business Process Analysis through the lens of Large Language Models: Prompting experiments with BPMN process serializations

Tracking #: 788-1779

Flag: Review Assignment Stage

Authors: 

Damaris Dolha
Ana-Maria Ghiran
Robert Buchmann

Responsible editor: 

Guest Editors Neuro-Symbolic AI and Conceptual Modeling 2024

Submission Type: 

Article in Special Issue (note in cover letter)

Full PDF Version: 

Cover Letter: 

Dear Editors,

We hereby submit the manuscript titled "Revisiting Business Process Analysis through the lens of Large Language Models: Prompting experiments with BPMN process serializations" for consideration for publication in the Neuro-symbolic AI Journal, in the special issue on Neuro-Symbolic AI and Domain Specific Conceptual Modelling.

The paper reports on comparative prompting experiments with alternative process representations of the same BPMN models: the standard BPMN XML serialization (as exported from the Signavio toolset) and a non-standard semantic RDF graph serialization (available as an export format from the Bee-Up modeling toolkit). The ability of ChatGPT 4 to answer process queries is compared through a series of prompting experiments on full explicit process models and on minimalist BPMN patterns with generic labelling. The quality of the answers is evaluated using the RAGAs framework. The work was motivated by the need to revisit the BPM lifecycle through the lens of what Large Language Models can bring to its different phases; for now we focus on the Process Analysis phase.

The submission extends a conference paper presented at BIR 2024 (Perspectives on Business Informatics Research) in September 2024: https://link.springer.com/chapter/10.1007/978-3-031-71333-0_2

The extensions in this journal version pertain to:

  • experiments on a variety of minimalist BPMN patterns (whereas the conference paper discussed only one end-to-end process exemplar);
  • insight into the structural differences between the RDF and XML serializations of BPMN and how they are reflected in the linear text-based serialization.

We look forward to hearing from you,

Damaris Dolha, Ana-Maria Ghiran, Robert Buchmann

Tags: 

  • Under Review