This October was one of the busiest yet most rewarding months of the year for me. Among several work trips, the highlight was attending the 27th European Conference on Artificial Intelligence (ECAI 2024) in Santiago de Compostela, Spain. Celebrating its 50th anniversary, ECAI remains Europe’s premier venue for AI research and innovation, bringing together thought leaders, researchers, and industry professionals from around the world.
This year’s theme, “Celebrating the Past, Inspiring the Future,” captured the spirit of ECAI’s half-century legacy while driving forward-looking discussions on the next era of artificial intelligence. With over 1,500 participants from 59 countries (so no longer such a purely European conference, but rather a global event), it offered a packed schedule of more than 150 events, among them:
“Towards Real-World Fact-Checking with Large Language Models,” keynote talk by Iryna Gurevych (Technische Universität Darmstadt), reflecting on advances in using language models to verify information in real time;
“Robots (Still) Need Humans in the Loop,” keynote talk by Iolanda Leite (KTH Royal Institute of Technology), who underscored the essential role humans play in AI-driven robotics, even as systems grow more autonomous;
“Economic Complexity: Using Machine Learning to Understand Economic Development,” keynote talk by Cesar A. Hidalgo (Toulouse School of Economics & Corvinus University of Budapest), which examined how machine learning is transforming our understanding of economic trends and predictions.
These were accompanied by a range of panels, a few of which stood out (in my personal opinion):
Economic Impact of AI: Threats and Opportunities with Jeremy Rollison (Microsoft Corporation), David Autor (MIT), and Raquel Jorge Ricart (Elcano Royal Institute) on AI’s potential to reshape labor markets and economies around the world;
AI Regulation: The European Scenario with Kilian Gross, Dr. Clara Neppel (IEEE), Beatriz Alvargonzalez Largo (European Commission), and José Miguel Bello Villarino (ARC Centre of Excellence for Automated Decision-Making and Society), which addressed regulatory considerations;
50th Anniversary Session on the History of AI in Europe, paying tribute to AI’s history in Europe, with Luc Steels, Stefano Cerri, Fredrik Heintz, and Tony Cohn sharing reflections on past achievements, and its “follow-up,” The Future of AI: The Next 50 Years, with Fredrik Heintz, Iryna Gurevych, José Hernández-Orallo, Ann Nowe, and Toby Walsh;
Designing Ethical and Trustworthy AI Research Policies for Horizon Europe centered on ethical standards and trustworthy AI research practices within the EU’s Horizon program, led by Mihalis Kritikos from the European Commission;
Funding your Scientific Research with the European Research Council (ERC) with Enrique Alba.
As part of this conference, I had the pleasure of presenting a paper co-authored with my former student Jan-Erik Kalmus, based on his Master’s thesis, which I had the privilege of supervising. Our paper, “To Accept or Not to Accept? An IRT-TOE Framework to Understand Educators’ Resistance to Generative AI in Higher Education,” examined the barriers that might prevent educators from adopting Generative AI tools in their classrooms.

Since the public release of ChatGPT, there has been a lively debate about the potential benefits and challenges of integrating Generative AI in educational contexts. While the technology holds promise, it has also sparked concerns, particularly among educators. In the field of information systems, technology adoption models are often used to understand the factors that encourage or inhibit the use of new technologies. However, many existing models focus primarily on acceptance drivers, often overlooking the unique barriers that educators face. This study seeks to fill that gap by developing a theoretical model specifically tailored to identify the barriers that may prevent educators – academic staff in particular – from integrating Generative AI into their teaching. Our approach builds on Innovation Resistance Theory, augmented by constructs from the Technology-Organization-Environment (TOE) framework.

Using a purpose-designed mixed-method measurement instrument that combines quantitative data with qualitative insights to capture educators’ specific concerns around Generative AI adoption in higher education, our model has been applied in a real-world setting, specifically Estonian higher education institutions. We examined whether academic staff at public universities in Estonia – often referred to as a “digital nation” – show reluctance toward Generative AI use in educational settings. Preliminary findings highlight several concerns unique to educators, which may shape how Generative AI is integrated into teaching practices.
A Heartfelt Thanks to ECAI’s Organizers – the European Association for Artificial Intelligence (EurAI), the Spanish Artificial Intelligence Society, CiTIUS (Research Centre on Intelligent Technologies), and, of course, the city of Santiago de Compostela for being such a welcoming place.
From 2021 to 2024, I had the privilege of being part of the “FAIR Metrics and Data Quality” Task Force, where we made strides in advancing the FAIR principles and improving data quality across the research community. Following the recent reorganization of these task forces, I am now excited to continue this work with the newly formed FAIR Metrics and Digital Objects Task Force, which, as we decided just yesterday, will be chaired by “the father” of the FAIR principles, Mark Wilkinson, and Elli Papadopoulou!
Our mission is to develop and implement metrics for the FAIR principles (Findability, Accessibility, Interoperability, and Reusability) that enhance the utility and impact of digital objects in research. This initiative is crucial in advancing open science and ensuring that scientific data is more accessible, reusable, and beneficial for the global research community.
As a member of this task force, I will be collaborating with a diverse group of experts to:
Identify the limitations of current FAIR assessment, which focuses mainly on the FAIRness of the repository when evaluating the discoverability and reusability of data, and is therefore insufficient for assessing the capability of data to be federated.
Track and promote initiatives (such as GREI, Signposting, RO-Crate, etc.) that facilitate the definition of common metadata schemas and their interoperability.
Identify issues around data privacy, considering data usage, data access, and data licensing, as well as specifications for machine-actionable data usage policies (e.g., ODRL).
Analyse the impact of provenance, especially in the context of federated environments.
Identify synergies with the Data Spaces initiative.
Define FAIR metrics according to the objectives of the task force.
Engage with research clusters, empowering them to implement data quality practices tailored to their unique contexts through actionable recommendations, such as data quality indicators, addressing areas like AI training and input data.
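To make the ODRL point above more concrete: ODRL expresses machine-actionable usage policies as linked data. Below is a minimal sketch in Python of what such a policy could look like; the dataset URL and policy identifier are hypothetical placeholders, not outputs of the task force.

```python
import json

# A minimal ODRL policy sketch: permits "use" of a dataset,
# constrained to non-commercial purposes, with attribution as a duty.
# The uid and target URLs are hypothetical placeholders.
policy = {
    "@context": "http://www.w3.org/ns/odrl.jsonld",
    "@type": "Set",
    "uid": "https://example.org/policy/001",
    "permission": [{
        "target": "https://example.org/dataset/weather-obs",
        "action": "use",
        "constraint": [{
            "leftOperand": "purpose",
            "operator": "eq",
            "rightOperand": "non-commercial"
        }],
        "duty": [{"action": "attribute"}]
    }]
}

# Serialize as JSON-LD, the form a repository could publish alongside the data.
print(json.dumps(policy, indent=2))
```

Because such a policy is structured data rather than free-text license prose, a harvester or federation layer can evaluate it automatically before reusing the dataset.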
I am eager to contribute to this new chapter and look forward to the — hopefully — impactful changes we will bring to the scientific community.
Stay tuned for updates on our progress and initiatives, while the progress we made over the past years is documented here!
Recently, the United Nations University announced the launch of the United Nations University EGOV’s repository platform – a centralized hub of specialized repositories tackling global challenges. It is dedicated to two topics: EGOV for Emergencies, which provides content on innovations in digital governance for emergency response, and Data for EGOV, the repository intended “to support policymakers, decision-makers, researchers, and the community interested in digitally transforming the public sector through emerging technologies and data. The repository combines diverse academic documents, use cases, rankings, best practices, standards, benchmarking, portals, datasets, and pilot projects to support open data, quality and purpose of open data, application of data analytics techniques in the public sector, and making cities smarter. This repository results from the ‘INOV.EGOV-Digital Governance Innovation for Inclusive, Resilient and Sustainable Societies’ project on the role of open data and data science technologies in the digital transformation of State and Public Administration institutions”. The latter recommends 286 reading materials (reports, articles, standards, etc.) that I find very relevant to the topics described above and highly recommend browsing through. What made me especially happy while browsing this collection, however, is the fact that five of these reading materials are articles (co-)authored by me. Therefore, considering that I do not always keep track of what I have done in the past, let me use this opportunity to reflect on those studies, in case you had not come across them previously, as well as to refresh my own memory (some of them date back to the time when I worked on my PhD thesis).
By the way, every article is accompanied by tags that enrich the keywords the authors used to describe it, with particular attention paid to main topics, incl. “data analytics”, “smart city”, “open data”, “sustainability”, etc. For the latter, “sustainability”, tagging is based on compliance with specific Sustainable Development Goals (SDGs), thereby allowing you to filter articles by a specific SDG or to find out which SDGs your article contributes to. Although while conducting my research I kept in mind the goals I found it most suited to, for one of the articles (the last one in the list) I was pretty surprised to see how SDG-compliant it is, matching 11 SDGs (SDG-2, SDG-3, SDG-6, SDG-7, SDG-9, SDG-11, SDG-13, SDG-14, SDG-15).
So, back to those studies that the United Nations University recommends…
A multi-perspective knowledge-driven approach for analysis of the demand side of the Open Government Data portal, which proposes a multi-perspective approach where an OGD portal is analyzed from (1) the citizens’ perspective, (2) the users’ perspective, (3) the experts’ perspective, and (4) the state of the art. By considering these perspectives, we can define how to improve the portal in question by focusing on its demand side. In view of the complexity of the analysis, we look for ways to simplify it by reusing data and knowledge on the subject, thereby proposing a knowledge-driven analysis that supports the core idea behind OGD – reuse. The Latvian open data portal is used as an example demonstrating how this analysis should be carried out, validating the proposed approach at the same time. We aim to find (1) the level of citizens’ awareness of the portal’s existence and its quality by means of a simple survey, (2) the key challenges that may negatively affect the user experience, identified in the course of a usability analysis carried out by both users and experts, and (3) a combination of these results with those already known from external sources. These data serve as the input, while the output is an assessment of the current situation that allows corrective actions to be defined. Since debates on the Latvian OGD portal, which serves as the use case, are appearing more frequently, this study also brings significant benefit at the national level.
Transparency of open data ecosystems in smart cities: Definition and assessment of the maturity of transparency in 22 smart cities, which focuses on the transparency maturity of open data ecosystems, seen as key to the development and maintenance of sustainable, citizen-centered, and socially resilient smart cities. This study inspects smart cities’ data portals and assesses their compliance with transparency requirements for open (government) data. The expert assessment of 34 portals representing 22 smart cities against 36 features allowed us to rank them and determine their level of transparency maturity according to four predefined maturity levels – developing, defined, managed, and integrated. In addition, recommendations for identifying and improving the current maturity level and specific features have been provided. An open data ecosystem in the smart city context has been conceptualized, and its key components determined. Our definition considers the components of the data-centric and data-driven infrastructure using a systems theory approach. We have defined five predominant types of current open data ecosystems based on prevailing data infrastructure components. The results of this study should contribute to the improvement of current data ecosystems and to building sustainable, transparent, citizen-centered, and socially resilient open data-driven smart cities.
Smarter open government data for society 5.0: Are your open data smart enough?, which starts from the observation that the open (government) data initiative, as well as users’ intentions for open (government) data, are changing continuously. Today, in line with IoT and smart city trends, real-time and sensor-generated data are of higher interest to users; they are considered one of the crucial drivers of a sustainable economy, may have an impact on ICT innovation, and can become a creativity bridge in developing a new ecosystem for Industry 4.0 and Society 5.0. The paper therefore examines 51 OGD portals for the presence of such data and their suitability for further reuse, analyzing their machine-readability, currency or frequency of updates, the ability to submit a request/comment/complaint/suggestion and its visibility to other users, and the ability to assess the value of these data as judged by others (i.e., rating, reuse, comments, etc.) – a task usually considered very time-consuming and complex, and therefore rarely conducted. The analysis leads to the conclusion that although many OGD portals and data publishers are working hard to make open data a useful tool on the way to Industry 4.0 and Society 5.0, many portals do not even respect basic open data principles, such as machine-readability. Moreover, according to the lists of most competitive countries by topic, there are no leaders that provide their users with excellent data and service, so there is room for improvement in all portals. The paper shows that open data – particularly data that are published and updated on time, provided in machine-readable formats, and supported for their users – attract audience interest and are used to develop solutions that benefit the entire society (as in France, Spain, Cyprus, the Netherlands, Taiwan, Austria, Switzerland, etc.).
Thus, the publication of open data should be done not only because it is a modern trend, but also because it incentivizes scientists, researchers and enthusiasts to reuse the data by transforming it into knowledge and value, providing solutions, improving the world, and moving towards Society 5.0 or the super smart society.
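As a small illustration of the machine-readability dimension audited in the study above, the share of machine-readable distributions on a portal can be estimated from dataset format metadata. The sketch below uses my own illustrative format classification and a hypothetical sample, not the paper’s exact criteria; real portals expose such metadata via their APIs (e.g., CKAN).

```python
# Formats commonly treated as machine-readable in open data audits.
# This classification is illustrative, not the paper's exact list.
MACHINE_READABLE = {"csv", "json", "xml", "rdf", "geojson", "xlsx"}

def machine_readable_share(formats):
    """Return the fraction of dataset distributions in machine-readable formats."""
    if not formats:
        return 0.0
    ok = sum(1 for f in formats if f.strip().lower() in MACHINE_READABLE)
    return ok / len(formats)

# Hypothetical portal sample: PDFs and HTML pages do not count.
sample = ["CSV", "PDF", "JSON", "HTML", "XML", "PDF"]
print(machine_readable_share(sample))  # 3 of 6 -> 0.5
```

Repeating such a check per portal yields a comparable indicator across countries, which is exactly why machine-readability is such a convenient entry point for large multi-portal studies.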
Definition and evaluation of data quality: User-oriented data object-driven approach to data quality assessment, which proposes a data object-driven approach to data quality evaluation. This user-oriented solution is based on 3 main components: the data object, the data quality specification, and the process of data quality measuring. These components are defined by 3 graphical DSLs that are easy enough even for non-IT experts. The approach ensures data quality analysis that depends on the use case, and it allows analysing the quality of “third-party” data. The proposed solution is applied to open data sets. The approbation of the approach demonstrated that open data have numerous data quality issues. Common data quality problems were detected not only in Latvian open data but also in the open data of 3 other European countries – Estonia, Norway, and the United Kingdom. For instance, none of the very simple, intuitive, and even obvious use cases in which the values of the primary parameters were analysed were satisfied by any company register. However, the Estonian and Norwegian registers can be used to identify any company by its name and registration number, since only they passed the quality checks of the relevant fields.
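The data object-driven idea can be illustrated with a toy sketch: a data quality specification as a set of per-field checks applied to a data object, here a company register record. The field names and rules below are my own hypothetical examples standing in for the paper’s graphical DSLs.

```python
import re

# A toy "data quality specification": each field maps to a predicate.
# Field names and rules are hypothetical, echoing the company-register use case.
spec = {
    "name": lambda v: isinstance(v, str) and v.strip() != "",
    "reg_number": lambda v: bool(re.fullmatch(r"\d{8,11}", str(v))),
}

def measure_quality(record, spec):
    """Apply the specification to one data object; return the failed field names."""
    return [field for field, check in spec.items() if not check(record.get(field))]

good = {"name": "Example OU", "reg_number": "12345678"}
bad = {"name": "", "reg_number": "N/A"}
print(measure_quality(good, spec))  # []
print(measure_quality(bad, spec))   # ['name', 'reg_number']
```

The point of the approach is that the specification, not the code, varies per use case: a different reuse scenario simply swaps in a different set of predicates over the same data object.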
Open Data Hackathon as a Tool for Increased Engagement of Generation Z: To Hack or Not to Hack?, which examines the role of open data hackathons – known as a form of civic innovation in which participants representing citizens can point out existing problems or social needs and propose a solution – in the OGD initiative. Given the high social, technical, and economic potential of open government data (OGD), the concept of open data hackathons is becoming popular around the world. In Latvia, it has taken the form of annual hackathons organised for a specific cluster of citizens – Generation Z. Contrary to the general opinion, the organizer suggests that the main goal of open data hackathons, to raise awareness of OGD, has been achieved, and there has been a debate about the need to continue them. This study presents the latest findings on the role of open data hackathons and the benefits they can bring to society, participants, and government alike. First, a systematic literature review is carried out to establish a knowledge base. Then, empirical research on 4 case studies of open data hackathons for Generation Z participants held between 2018 and 2021 in Latvia is conducted to understand which ideas dominated and what the main results of these events were for the OGD initiative. It demonstrates that, despite the widespread belief that young people are indifferent to current societal and natural problems, the ideas developed correspond to the current situation and aim to solve these problems, revealing aspects for improvement in data provision, infrastructure, culture, and government-related areas.
More to come, and let’s keep track of updates in this repository! Do not forget to check out the other works in the repository, as well as more of my work, which you can find here.
This April the next edition of the Research and Innovation Forum (Rii Forum), on which I posted previously, will take place. For those who are not familiar with Rii Forum yet, it is an annual conference that brings together researchers, academics, and practitioners in a conceptually sound, inter- and multi-disciplinary, empirically driven debate on key issues influencing the dynamics of social interaction today. Such a wide scope makes it a great event for those who do not want to be limited to a particular area or research question and want to be aware of everything that happens in today’s dynamic and multidisciplinary world. This, in turn, allows you not only to see other perspectives and topics, but also to reconsider your own topic from a different angle, revealing something new, which is exceptionally valuable!
Technology, innovation, and education, as well as issues and topics located at their intersection, define the key dimensions of all discussions held during the Rii Forum. In continuously fragile international and domestic contexts, characterized by shocks, crises, and uncertainty, the Rii Forum 2023 seeks to address the multifaceted question of how to navigate these shocks, crises, and uncertainty and deliver value to our society. Thus, the topic of Rii Forum 2023 is “Innovation 5.0: Navigating shocks and crises in uncertain times. Technology – Business – Society”, with seven tracks:
TRACK 1: Education in times of shocks, crises and uncertainty
TRACK 2: Smart cities and communities
TRACK 3: Big data, business and society: Managing the distributed risks and opportunities
TRACK 4: Management: Rethinking management in times of profound change
TRACK 5: Innovation, entrepreneurship, and innovation management in the era of Industry 5.0.
TRACK 6: ICT and the medicine and healthcare cluster
TRACK 7: Data-driven approaches & human resource management in the era of digitalization
As part of Rii Forum 2023, a plenary debate “Advances in ICT & the Society: threading the thin line between progress, development and mental health” will take place, where I was honored to be invited as one of four plenary speakers, particularly considering that, according to the invitation, the organizers see me as someone whose “expertise and your contribution to the academic debate make you one of the trendsetters in current debate on open data and data quality management”, as well as a leading voice and influencer. The other three panel discussants are Prof. Dr. Marek Krzystanek, Karolina Laurentowska & Prof. Marek Pawlicki. I hope this will be an interactive, fruitful, and productive discussion with further involvement of the audience!
📣📣📣 This is a short post to let you know that we have reached the most important phase of the consultation process for the European Digital Skills Certificate (#EDSC) feasibility study. On behalf of EDSC – as an EDSC Ambassador – I sincerely invite public authorities and policy makers working in the fields of education, digitalisation, and employment, as well as education and training providers at the national, regional, and/or local level, to participate in this last survey.
🎯🎯🎯 This survey aims to get an overview of the potential demand for an EDSC, the needs and gaps in the current digital skills certification ecosystem, the expected value and potential benefits of an EDSC, and the key requirements for an EDSC.
The European Commission’s Directorate-General for Employment, Social Affairs and Inclusion has initiated Action 9 of the Digital Education Action Plan (DEAP 2021-2027), that is, to develop a European Digital Skills Certificate (EDSC) that may be recognised and accepted by governments, employers, and other stakeholders across Europe. This would allow Europeans to indicate their level of digital competence, corresponding to the Digital Competence Framework (DigComp) proficiency levels.
In order to participate, you first have to register. If you are not yet registered, please fill in the form available at the link: https://edsc-consultation.eu/register/
The survey takes around 30 minutes, and it will be open until the 7th of April 2023.
Thank you in advance for taking the time to complete the questionnaire!