During the 28th International Conference on Knowledge-Based and Intelligent Information & Engineering Systems (KES 2024), Dr. Włodzimierz Lewoniewski presented two papers: “Exploring the Challenges and Potential of Generative AI: Insights from an Empirical Study” and “Supporting fact-checking process with IT tools”. The conference was held on September 11-13, 2024 in hybrid mode.
Exploring the Challenges and Potential of Generative AI: Insights from an Empirical Study
As part of the paper “Exploring the Challenges and Potential of Generative AI: Insights from an Empirical Study”, an experiment was conducted to investigate to what extent content generated by GenAI can be considered reliable (i.e. does not contain false information) and how easy it is to persuade GenAI to generate false information.
Based on GPT (Generative Pre-trained Transformer) models, ChatGPT is an example of a generative artificial intelligence (GenAI) tool. ChatGPT gained its first million users in less than a week and reached 100 million users within two months. ChatGPT belongs to a broader group of tools called large language models (LLMs). Trained on huge corpora of text, such tools can generate content that convincingly imitates the way humans use natural language to convey information. Many people have started using these tools to obtain information as an alternative to traditional Internet search engines.
However, GenAI tools work differently from simple search engines. Rather than retrieving texts that match key phrases, they draw on knowledge encoded during training and provide it in the form of well-constructed answers tailored to the questions asked. This raises two questions for researchers: can AI serve as a tool for verifying information, and does the generated text contain only true information?
As part of the experiment, a group of students was tasked with preparing essays using generative AI tools. The essays were to be written in Polish, contain a specified number of characters and cover predefined thematic areas. In addition to “generating” the essay, the students also had to write their own critical assessment of it according to specific criteria. First, they were to assess the credibility of the text generated by AI, i.e. verify whether it contained any false or incorrect information. Second, they were asked to induce GenAI to include invented statements in the generated text, i.e. to generate fake news.
At the time the experiment was conducted, it was known that large language models can be subject to so-called hallucinations, i.e. spontaneously generate false content. Students were therefore required to detect and correct such errors. This made it possible to assess whether available GenAI tools could potentially be used for tasks related to information verification and fake news detection.
Authors of the paper: Dr. Milena Stróżyna, Prof. Krzysztof Węcel, Dr. Piotr Stolarski, Ewelina Księżniak, Marcin Sawiński, Dr. Włodzimierz Lewoniewski, Prof. Witold Abramowicz.
Supporting fact-checking process with IT tools
The paper “Supporting fact-checking process with IT tools” presents a detailed analysis of the fact-checking process, based on a literature review and interviews with organizations specializing in this field. This combined theoretical knowledge with the practical experience of experts, which made it possible to identify research gaps and challenges in current information verification methods.
Disinformation and so-called “fake news” pose a serious threat to society, influencing public opinion, political decisions and our daily lives. Therefore, research on the fact-checking process is extremely important. Initially, fact-checking consisted of checking all information in press articles before their publication, which is a basic obligation of journalists. Nowadays, the term also refers to analyzing content after its publication, especially on the Internet. Increasingly, fact-checking is carried out by people and organizations unrelated to the author of the verified information.
The paper presents a comprehensive fact-checking process along with a list of potential IT tools that can improve it. Based on the analyses conducted, a reference model of fact-checking was developed, describing the main stages of the process and allowing for a better understanding and standardization of actions in the fight against disinformation. The work also includes an overview of available IT tools supporting the verification process, which served to identify technological gaps.
A key element of the work is the description of the SocialScan tool, developed as part of the OpenFact project, which allows for the identification of potential fake news in social media. The tool was created to help fact-checkers identify information requiring verification more quickly and effectively. Consequently, it may contribute to limiting the spread of fake news.
Authors of the paper: Marcin Sawiński, Dr. Milena Stróżyna, Dr. Włodzimierz Lewoniewski, Dr. Piotr Stolarski, Prof. Krzysztof Węcel, Ewelina Księżniak, Prof. Witold Abramowicz.
OpenFact
The Department of Information Systems is currently implementing the OpenFact research project, headed by Prof. Witold Abramowicz. As part of this project, tools for the automatic detection of fake news in Polish are being developed. In July 2024, the results of the OpenFact project were rated the highest by the National Centre for Research and Development for the second year in a row.
The OpenFact project is financed by the National Centre for Research and Development under the INFOSTRATEG I program “Advanced information, telecommunications and mechatronic technologies”.
Participation in the conference was possible thanks to funding from the competition for financing conference trips called “RIGE – conferences”, supported by funds granted by the Minister of Science of the Republic of Poland under the “Regional Initiative for Excellence” Programme for the implementation of the project “The Poznań University of Economics and Business for Economy 5.0: Regional Initiative – Global Effects (RIGE)”.