Our paper establishes an understanding of how OGD enables and drives the co-creation of new services and, in the process, creates public value within the context of a crisis – specifically the COVID-19 pandemic. The research started from an observation that OGD had already seen use in previous crises (such as floods and forest fires) and that, during the COVID-19 pandemic, OGD was being released in large amounts and reused consistently. During the earliest stages of the pandemic, some research explored the adoption and implementation of data-driven innovations and services, but it often focused only on new services, on the datasets released, or on the potential value created by these services. Research exploring the role that OGD specifically played in the crisis management process was missing.
While there are several different understandings of crisis – from a political scandal to a natural disaster – this paper is concerned primarily with crises that have the potential to directly and negatively impact large numbers of individuals, i.e., natural disasters, conflicts, or health-related pandemics, epidemics, or endemics such as COVID-19. The common factors of these crises are a large divergence from normal operating conditions, risk to wellbeing, and the need to manage them quickly and, often, with inadequate information. In such a situation, OGD may play an important role by aiding in the creation, collection, and dissemination of information, thus creating new opportunities for service co-creation and public value generation.
To generate new insight into how OGD was used during the COVID-19 pandemic as a crisis management tool, a comparative exploratory case study was conducted, focusing on three Central and East European (CEE) countries: the Czech Republic, Estonia, and Latvia, which are united by a shared history, traditionally lower levels of administrative capacity, and the relative newness of their public administrations. Within this context, technology is often embraced to substitute for missing capacities; unsurprisingly, many CEE countries are viewed as leaders when it comes to the use of digital technologies by the public sector. This provides an ideal context for exploring how OGD could be used during a crisis – states already accustomed to using digital technologies in their administrations, often to overcome strategic shortages or skill gaps, which may provide incentives for administrations to engage in co-creation. With the cases selected, two primary research questions were formulated and used to drive the study:
RQ1: What is the role of OGD in crisis management?
RQ2: How does OGD influence the co-creation of services that assist in crisis management?
In conducting the research to answer these questions, three clear contributions emerged.
💡 First, this paper provides new insight into the role that OGD may play during times of crisis and offers initial conceptual propositions about the role of OGD in the crisis management process.
💡 Second, new empirical exploration into how OGD drives the co-creation of new services is provided.
💡 Third, the paper demonstrates empirically the potential that OGD has when it is released and used.
Ultimately, this paper aims to be of use to practitioners who may be directly involved in the management of a crisis or of an OGD initiative, to academics interested in studying the impacts of OGD or crisis management, and to citizens, who can see how they can play an active role in co-creation processes and thereby create value.
Sounds interesting? I hope so! If yes, find the paper here, while a pre-print (in case you cannot access the full text) is available here.
This April, the next edition of the Research and Innovation Forum (Rii Forum), on which I posted previously, will take place. For those who are not familiar with Rii Forum yet, it is an annual conference that brings together researchers, academics, and practitioners in a conceptually sound, inter- and multi-disciplinary, empirically driven debate on key issues influencing the dynamics of social interaction today. Such a wide scope makes it a great event for those who do not want to be limited to a particular area or research question and want to be aware of everything that happens in today’s dynamic and multidisciplinary world. This, in turn, allows you not only to see other perspectives and topics, but also to reconsider your own topic and reveal something new, i.e., to take a look at it from a different angle, which is exceptionally valuable!
Technology, innovation, and education, as well as issues and topics located at their intersection, define the key dimensions of all discussions held during the Rii Forum. In continuously fragile international and domestic contexts, characterized by shocks, crises, and uncertainty, Rii Forum 2023 seeks to address the multifaceted question of how to navigate these shocks, crises, and uncertainty and deliver value to our society. Thus, the topic of Rii Forum 2023 is “Innovation 5.0: Navigating shocks and crises in uncertain times. Technology – Business – Society”, with seven tracks:
TRACK 1: Education in times of shocks, crises and uncertainty
TRACK 2: Smart cities and communities
TRACK 3: Big data, business and society: Managing the distributed risks and opportunities
TRACK 4: Management: Rethinking management in times of profound change
TRACK 5: Innovation, entrepreneurship, and innovation management in the era of Industry 5.0
TRACK 6: ICT and the medicine and healthcare cluster
TRACK 7: Data-driven approaches & human resource management in the era of digitalization
As part of Rii Forum 2023, a plenary debate “Advances in ICT & the Society: threading the thin line between progress, development and mental health” will take place, where I was honored to be invited as one of four plenary speakers – particularly considering that, according to the invitation, the organizers see me as a person whose “expertise and your contribution to the academic debate make you one of the trendsetters in current debate on open data and data quality management”, as well as a leading voice and influencer. The other three panel discussants are Prof. Dr. Marek Krzystanek, Karolina Laurentowska & Prof. Marek Pawlicki. I hope this will be an interactive, fruitful, and productive discussion with active involvement of the audience!
Today, in the age of information and Industry 4.0, billions of data sources, including but not limited to interconnected devices (sensors, monitoring devices) forming Cyber-Physical Systems (CPS) and the Internet of Things (IoT) ecosystem, continuously generate, collect, process, and exchange data1. With the rapid increase in the number of devices (smart objects or “things”, e.g., smartphones, smartwatches, intelligent vehicles etc.) and information systems in use, the amount of data is growing. Moreover, due to the digitization and variety of data being continuously produced and processed with reference to Big Data, their value is also growing, and as a result, so is the risk of security breaches and data leaks, including but not limited to threats to users’ privacy2. The value of data, however, depends on several factors, of which data quality and data security are the most vital – where a security breach that allows data to be accessed and corrupted also degrades data quality. Data serve as the basis for decision-making and as input for models, forecasts, simulations etc., which can be of high strategic and commercial / business value.
This has become even more relevant during the COVID-19 pandemic, which, in addition to affecting the health, lives, and lifestyles of billions of citizens globally, has made the world even more digitized, i.e., the digital environment has replaced the physical, and this has had a significant impact on business3. This is especially the case because of the challenges companies have faced in maintaining business continuity in this so-called “new normal”. However, in addition to cybersecurity threats caused by changes directly related to the pandemic and its consequences, many previously known weaknesses have become even more attractive targets for intruders and hackers. Every year millions of personal records become available online4, 5, 6.
Lallie et al. have compiled statistics on the state of the cybersecurity landscape during the pandemic, which clearly indicate a significant increase in attacks. As an example, Shi reported a 600% increase in phishing attacks in March 2020, just a few months after the start of the pandemic, when some countries were not yet even affected.
Miles, however, reported that 2021 saw a record-breaking number of data compromises, where “the number of data compromises was up more than 68% when compared to 2020”, while LinkedIn was the most exploited brand in phishing attacks, followed by DHL, Google, Microsoft, FedEx, WhatsApp, Amazon, Maersk, AliExpress, and Apple.
Recent research7,8,9,10,11 demonstrated that weak data and, in particular, database protection is one of the key security threats. This poses a serious security risk, especially in light of the popularity of search engines for Internet-connected devices, also known as Internet of Things Search Engines (IoTSE), Internet of Everything (IoE) or Open Source Intelligence (OSINT) search engines, such as Shodan, Censys, ZoomEye, BinaryEdge, Hunter, Greynoise, and IoTCrawler. While these tools may represent a security risk, they also provide many positive and security-enhancing opportunities. They provide an overview of network security, i.e., of the devices connected to the Internet within a company; they are useful for market research and for adapting business strategies; they allow tracking the growing number of smart devices representing the IoT world and tracking ransomware – the number and nature of devices affected by it – and therefore allow determining the appropriate actions to protect yourself in light of current trends. However, almost every one of these white hat-oriented objectives can be exploited by black-hatters. The popularity of IoTSE has lowered the complexity of searching for connected devices on the Internet, making it easy even for novices, thanks to the widespread availability of step-by-step guides on how to use an IoT search engine to find – and, if insufficiently protected, gain access to – webcams, routers, databases (in particular non-relational (NoSQL) databases), and other more “exotic” artifacts such as power plants, wind turbines, or refrigerators. These engines provide service- and country-wise exposure dashboards, top vulnerabilities according to CVE, statistics on authentication status, Heartbleed, BlueKeep – a vulnerability revealed in Microsoft’s Remote Desktop Protocol, which became even more widely used during the pandemic – port usage, and the number of already compromised databases.
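As a rough illustration (not taken from the paper or any specific IoTSE API), the kind of triage such engines enable can be sketched as a naive filter over search results: flag services that listen on well-known database ports and whose banner gives no hint of authentication. The response structure and the banner heuristic below are simplifying assumptions for illustration only.

```python
# Hypothetical sketch: flagging potentially exposed database services
# from an IoTSE-style (e.g., Shodan-like) search response.
# The record layout and the "authentication" banner heuristic are
# assumptions, not the exact schema of any real engine.

DEFAULT_DB_PORTS = {
    27017: "MongoDB",
    6379: "Redis",
    9200: "Elasticsearch",
    5984: "CouchDB",
    11211: "Memcached",
}

def flag_exposed(matches):
    """Return (ip, service) pairs whose banner suggests no authentication."""
    flagged = []
    for m in matches:
        service = DEFAULT_DB_PORTS.get(m.get("port"))
        banner = m.get("data", "").lower()
        # Naive heuristic: a banner that never mentions authentication
        # on a default database port is worth a closer look.
        if service and "authentication" not in banner:
            flagged.append((m["ip_str"], service))
    return flagged

sample = [  # fabricated example records
    {"ip_str": "203.0.113.5", "port": 27017, "data": "MongoDB server information ..."},
    {"ip_str": "203.0.113.9", "port": 6379, "data": "Redis ... authentication required"},
]
print(flag_exposed(sample))
```

A real assessment would of course rely on the engine's own metadata rather than banner keywords, but the sketch shows why such engines make reconnaissance cheap: the filtering logic fits in a dozen lines.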
Some of these data play a significant role for experienced and skilled attackers, making their activities even less resource-intensive, e.g., by providing an overview of the ports to use to increase the likelihood of gaining faster access to an artifact.
According to the Risk Based Security Monthly Newsletter, 73 million records were exposed in March 2022, and 358 vulnerabilities were identified as having a public exploit but no CVE ID yet assigned. And while the 2021 Year End Vulnerability Report by Risk Based Security & Flashpoint suggests that the vulnerability landscape is returning to normal, another trigger closely related to cybersecurity is now affecting the world – geopolitical upheaval.
In the past, vulnerability databases such as CVE Details were considered useful resources for monitoring the security level of a product in use. Indeed, the CVE registry covers vulnerabilities divided into 13 types: (1) bypass something, e.g., a restriction, (2) cross-site scripting (XSS), (3) denial of service (DoS), (4) directory traversal, (5) code execution (arbitrary code on a vulnerable system), (6) gain privileges, (7) HTTP response splitting, (8) memory corruption, (9) gain / obtain information, (10) overflow, (11) cross-site request forgery (CSRF), (12) file inclusion, (13) SQL injection. However, such registries are static: they record common vulnerabilities of a product only once a vulnerability has been detected and registered. Advances in ICT, including the power of IoTSE, require the use of more dynamic techniques for this purpose.
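To make the 13-type taxonomy concrete, here is a toy sketch (not an official CVE Details tool) that maps keyword triggers to those types and tags a free-text vulnerability summary; the keyword list is an assumption for illustration, and a real classifier would use the registry's own type field rather than string matching.

```python
# Toy sketch: tagging a vulnerability summary with the 13 CVE Details
# types listed above. The keyword-to-type mapping is an illustrative
# assumption, not the registry's actual classification logic.

CVE_TYPES = {
    "bypass": "bypass something",
    "cross-site scripting": "XSS",
    "denial of service": "DoS",
    "directory traversal": "directory traversal",
    "code execution": "code execution",
    "gain privileges": "gain privileges",
    "http response splitting": "HTTP response splitting",
    "memory corruption": "memory corruption",
    "obtain information": "gain / obtain information",
    "overflow": "overflow",
    "cross site request forgery": "CSRF",
    "file inclusion": "file inclusion",
    "sql injection": "SQL injection",
}

def classify(summary):
    """Return the CVE Details types whose trigger keyword appears in the text."""
    text = summary.lower()
    return [label for keyword, label in CVE_TYPES.items() if keyword in text]

print(classify("SQL injection in the login form allows database access"))
```

The point of the sketch is the taxonomy's flat, static nature: every entry is a label attached after the fact, which is exactly why the text argues for complementing registries with dynamic inspection.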
While security breaches and different security protection mechanisms have been widely covered in the literature, a seemingly “primitive” artifact such as the data management system appears to have been neglected by researchers and practitioners. But are data management systems always protected by default? Previous research and regular reports on data leakages suggest that the number and variety of such vulnerabilities are high, and that DBMSs often have little or no protection, especially in the case of NoSQL databases, which are thus vulnerable to attacks.
The aim of this paper is to examine current data security research and to analyse whether “traditional” vulnerability registries provide sufficient insight into DBMS security, or whether they should rather be inspected using IoTSE-based passive testing, or dynamically inspected by DBMS holders conducting active testing. The paper brings attention to this problem and makes the reader think about data security before looking for and introducing more advanced security and protection mechanisms, which, in the absence of the basics, may bring no value. As regards the IoTSE tool, this study refers to the Shodan- and BinaryEdge-based vulnerable open data sources detection tool – ShoBeVODSDT – proposed by Daskevics and Nikiforova (2021).
This study provided a brief insight into the current state of data security as reflected by CVE Details – the most widely known vulnerability registry – considering 13 databases. Although the idea of CVE Details is appealing, i.e., it supports stakeholder engagement, where any person or organization can submit a report about a detected vulnerability in a product, it is clearly not sufficiently comprehensive. It can be used to monitor the current state of vulnerabilities, but this static approach, which sometimes provides incomplete or inconsistent information even about revealed vulnerabilities, must be complemented by other, more dynamic solutions. This includes IoTSE-based tools, which, while providing valuable insight into unprotected databases visible or even accessible from outside the organization, are also insufficient on their own.
The paper shows an obvious reality that is, however, not always visible to the company. In other words, while this may seem surprising in light of current advances, the first step that still needs to be taken when thinking about data security is to make sure that the database uses the basic security features: authentication, access control, authorization, auditing, data encryption, and network security12,13,14. Ignorance or lack of awareness can have serious consequences, leading to data leakages if these vulnerabilities are exploited. Data security and appropriate database configuration are not only a NoSQL concern – although NoSQL is typically considered much less secure – but also an RDBMS one: this study has shown that RDBMSs are also susceptible to various types of vulnerabilities. Moreover, there is no “secure by design” database, which is not surprising, since absolute security is known to be impossible. However, this does not mean that no action should be taken to improve it. More precisely, it should be a continuous process consisting of a set of interrelated steps, sometimes referred to as “reveal – prioritize – remediate”. It should be noted that 85% of breaches in 2021 were due to the human factor, with social engineering recognized as the most popular pattern15. The reason is that even with highly developed and mature data and system protection mechanisms (e.g., IDS), the human factor remains very difficult to control. Therefore, education and training of system users in digital literacy, as well as the definition, implementation, and maintenance of security policies and a risk management strategy, must complement technical advances.
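The “make sure the basics are in place” step can itself be automated. The sketch below (my illustration, not from the paper) audits a parsed DBMS configuration against a baseline covering the basic features listed above; the option names mirror MongoDB-style settings but are assumptions chosen for readability.

```python
# Illustrative sketch: auditing a parsed DBMS configuration against a
# baseline of basic security controls. The option names below are
# MongoDB-style assumptions for illustration, not a real config schema.

REQUIRED = {
    "authorization": "enabled",   # authentication / access control
    "tls_mode": "requireTLS",     # encryption of data in transit
    "bind_ip": "127.0.0.1",       # network security: no exposure on 0.0.0.0
    "audit_log": "enabled",       # auditing
}

def audit_config(cfg):
    """Return, sorted, the controls whose value deviates from the baseline."""
    return sorted(k for k, v in REQUIRED.items() if cfg.get(k) != v)

# A configuration like those the paper warns about: authentication off,
# listening on all interfaces, no TLS, no audit log.
risky = {"authorization": "disabled", "bind_ip": "0.0.0.0"}
print(audit_config(risky))
```

Running such a check before deployment is exactly the “reveal” step of the reveal – prioritize – remediate cycle mentioned above.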
Nikiforova, A., Daskevics, A., & Azeroual, O. (2023). NoSQL Security: Can My Data-driven Decision-making Be Influenced from Outside? In Big Data and Decision-Making: Applications and Uses in the Public and Private Sector (pp. 59-73). Emerald Publishing Limited.
Daskevics, A., & Nikiforova, A. (2021, November). ShoBeVODSDT: Shodan and Binary Edge based vulnerable open data sources detection tool or what Internet of Things Search Engines know about you. In 2021 Second International Conference on Intelligent Data Science Technologies and Applications (IDSTA) (pp. 38-45). IEEE.
Daskevics, A., & Nikiforova, A. (2021, December). IoTSE-based open database vulnerability inspection in three Baltic countries: ShoBEVODSDT sees you. In 2021 8th International Conference on Internet of Things: Systems, Management and Security (IOTSMS) (pp. 1-8). IEEE.
📣📣📣 This is a short post to let you know that we have reached the most important phase of the consultation process for the European Digital Skills Certificate (#EDSC) feasibility study, and on behalf of EDSC – as EDSC Ambassador – I sincerely invite public authorities and policy makers working in the fields of education, digitalisation, and employment, as well as education and training providers at the national, regional, and/or local level, to participate in this last survey.
🎯🎯🎯 This survey aims to get an overview of the potential demand for an EDSC, the needs and gaps in the current digital skills certification ecosystem, the expected value and potential benefits of an EDSC, and the key requirements for an EDSC.
The European Commission’s Directorate-General for Employment, Social Affairs and Inclusion has initiated Action 9 of the Digital Education Action Plan (DEAP 2021-2027), that is, to develop a European Digital Skills Certificate (EDSC) that may be recognised and accepted by governments, employers, and other stakeholders across Europe. This would allow Europeans to indicate their level of digital competences, corresponding to the Digital Competence Framework (DigComp) proficiency levels.
This post is dedicated to two very pleasant events for me, namely the international Open Data Day 🎉🍾🥂, and the announcement of the keynote talk that I was kindly invited to deliver at the 5th International Conference on Advanced Research Methods and Analytics (CARMA) organized by Universidad de Sevilla, Cátedra Metropol Parasol, Cátedra Digitalización Empresarial, IBM, Universitat Politècnica de València and 🥁 🥁 🥁 Coca-Cola – what a delicious conference!🍸🍸🍸
CARMA is a forum for researchers and practitioners to exchange ideas and advances on how emerging research methods and sources are applied to different fields of social sciences, as well as to discuss current and future challenges. Its main topics include:
- Internet and Big Data sources in economics and social sciences: social media and public opinion mining, web scraping, Google Trends and search engine data, geospatial and mobile phone data, open data and public data;
- Big Data methods in economics and social sciences: sentiment analysis, internet econometrics, AI and machine learning applications, statistical learning, information quality and assessment, crowdsourcing, natural language processing, explainability and interpretability;
- Applications of the above, including but not limited to: politics and social media, sustainability and development, finance applications, official statistics, forecasting and nowcasting, bibliometrics and scientometrics, social and consumer behaviour, mobility patterns, eWOM and social media marketing, labor market, business analytics with social media, advances in travel, tourism and leisure, digital management, marketing intelligence analytics, data governance;
- Digital transition and global society, with contributions expected in relation to privacy and legal aspects, electronic government, data economy, smart cities, and industry adoption.
And like almost every conference, CARMA will have keynotes – two of them: Patrick Mikalef, who will talk about Responsible AI and Big Data Analytics, and me, with a keynote devoted to the topics I have studied in recent years, titled “Public data ecosystems in and for smart cities: how to make open / Big / smart / geo data ecosystems value-adding for SDG-compliant Smart Living and Society 5.0?” Sounds interesting? (I hope so!) Stay tuned to learn more! And do come back, since I plan to reflect on the content of both talks and the conference in general.
The CARMA 2023 conference will be held on 28–30 June 2023 at the University of Seville.