CFP: The International Conference on Intelligent Data Science Technologies and Applications (IDSTA2023)

On behalf of the organizers and as a publicity chair, I sincerely invite you to consider submitting the results of your recent research to The International Conference on Intelligent Data Science Technologies and Applications (IDSTA2023), which will be held in conjunction with the Kuwait Fintech and Blockchain Summit.

Huge amounts of data are being generated and transmitted every day. Intelligent technologies and applications are needed to deal with this data: to extract useful information from it, store it, transmit it, and represent it. The International Conference on Intelligent Data Science Technologies and Applications (IDSTA) is a peer-reviewed conference whose objective is to advance the Data Science field by giving researchers, engineers, and practitioners an opportunity to present their latest findings. It will also invite key figures in the field to share their current knowledge and their expectations for its future. Topics of interest for submission include, but are not limited to:

💡Applied Public Affairs, incl. but not limited to Campaign Management, Mass Communication Politics, Political Analysis, Survey Sampling
💡Business Analytics, incl. but not limited to Stock Market Analysis, Predictive Analytics, Business Intelligence
💡Finance, incl. but not limited to Risk Management, Algorithmic Trading, Fraud Detection, Financial Analysis
💡Computer Science, incl. but not limited to Database Management Systems, Scientific Computing, Computer Vision, Fuzzy Computing, Feature Selection, Neural Networks, Deep Learning, Meta-Learning, Process Mining, Artificial Intelligence, Data Mining, Big Data, Web Analytics, Text Mining, Natural Language Processing, Sentiment Analysis, Social Media Analysis, Data Fusion, Performance Analysis and Evaluation, Evolutionary Computing and Optimization, Hybrid Methods, Granular Computing, Recommender Systems, Data Visualization, Predictive Maintenance, Internet of Things (IoT), Web Scraping
💡Sustainability, incl. but not limited to Datasets on Sustainability, Sustainability Modeling, Energy Sustainability, Water Sustainability, Environmental Sustainability, Risk Analysis
💡Cybersecurity, incl. but not limited to Data Privacy and Security, Network Security, Communication Security, Cryptography, Fraud Detection, Blockchain
💡Environmental Science, incl. but not limited to GIS, Climatography, Remote Sensing, Spatial Data Analysis, Weather Prediction and Tracking
💡Biotechnologies, incl. but not limited to Genome Analysis, Drug Discovery, Screening and Side-Effect Analysis, Structural and Folding Pattern Analysis, Disease Discovery and Classification, Bioinformatics, Next-Gen Sequencing
💡Smart City, incl. but not limited to City Data Management, Smart Traffic, Surveillance, Location-Based Services, Robotics
💡Human Behaviour Understanding
💡Semi-Structured and Unstructured Data
💡Pattern Recognition
💡Transparency in Research Data
💡Data and Information Quality
💡GPU Computing
💡Crowdsourcing


🗓️🗓️🗓️ IMPORTANT DATES

  • Paper submission: March 15, 2023
  • Acceptance notification: May 20, 2023
  • Full paper camera-ready submission: October 1, 2023
  • Conference dates: October 24-26, 2023

All papers that are accepted, registered, and presented at IDSTA2023 and its co-located workshops will be submitted to IEEE Xplore for possible publication.
For any inquiries, contact intelligenttechorg@gmail.com.

Submit your paper and meet our team in Kuwait in October 2023!
 

With best wishes,

IDSTA2023 organizers

Towards data quality by design – an ISO/IEC 25012-based methodology for managing DQ requirements in the development of IS – one of the most downloaded articles of DKE as of July 2023

Users should be able to trust the data managed by the software applications that constitute an Information System (IS). This means that organizations should ensure an appropriate level of quality of the data they manage in their IS; the requirement for an adequate level of data quality must therefore be an essential requirement for every organization. Many advances have been made in recent years in software quality management, at both the process and product level. This is supported by the number of global standards that have been developed, addressing specific issues using quality models (ISO 25000, ISO 9126), process maturity models (ISO 15504, CMMI), and standards focused mainly on software verification and validation (ISO 12207, IEEE 1028, etc.). These standards have been adopted worldwide for over 15 years.

However, perceived software quality also depends on other variables, such as the quality of the information and data managed by the application. This is recognized by the SQuaRE series of standards (ISO/IEC 25000), which highlights the need to treat data quality as part of assessing the quality level of a software product: "the target computer system also includes computer hardware, non-target software products, non-target data, and the target data, which is the subject of the data quality model". This means that organizations should take data quality concerns into account when developing software, as data is a key factor. To this end, we stress that such concerns should be considered at the initial stages of software development, following the "data quality by design" principle (the well-known "quality by design" principle is invoked relatively often, but with significantly less attention, if any, to "data quality" as a subset of the "quality" concept when referring to data and information artifacts).

The "data quality" concept is considered to be multidimensional and largely context-dependent, which makes managing specific DQ requirements a difficult task. Thus, the main objective of our new paper, "ISO/IEC 25012-based methodology for managing data quality requirements in the development of information systems: Towards data quality by design", is to present a methodology for project management of Data Quality Requirements Specification, called DAQUAVORD, aimed at eliciting DQ requirements arising from different users' viewpoints. These specific requirements should then serve as typical requirements, both functional and non-functional, during the development of an IS that takes Data Quality into account by default, leading to smarter and more collaborative development.

In a bit more detail, we introduce the concept of a Data Quality Software Requirement as a means of implementing a Data Quality Requirement in an application: a Data Quality Software Requirement is a software requirement aimed at satisfying a Data Quality Requirement. The justification for this concept is that we want to capture the Data Quality Requirements that best match the data used by a user in each usage scenario, and later derive the consequent Data Quality Software Requirements that complement the normal software requirements linked to each of those scenarios. Addressing multiple Data Quality Software Requirements is indisputably a complex process, given the existence of strong dependencies, such as internal constraints and interaction with external systems, and the diversity of users. As a result, they tend to produce contradictory overlaps in both process and data models.
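To make the relationship concrete, here is a minimal, purely illustrative sketch of how a viewpoint-specific Data Quality Requirement could be traced to the Data Quality Software Requirement derived from it. The class and field names are my own assumptions for illustration, not DAQUAVORD's published notation.

```python
from dataclasses import dataclass

@dataclass
class DataQualityRequirement:
    viewpoint: str          # role expressing the requirement, e.g. a data entry clerk
    dq_characteristic: str  # ISO/IEC 25012 characteristic, e.g. "Completeness"
    data_entity: str        # the data the requirement applies to
    statement: str          # the requirement in natural language

@dataclass
class DataQualitySoftwareRequirement:
    derived_from: DataQualityRequirement  # traceability back to the DQ requirement
    kind: str                             # "functional" or "non-functional"
    statement: str

# A DQ requirement captured from one usage scenario...
dqr = DataQualityRequirement(
    viewpoint="Data entry clerk",
    dq_characteristic="Completeness",
    data_entity="patient record",
    statement="Every patient record must include a contact phone number.",
)

# ...originates a software requirement that complements the normal
# software requirements linked to that scenario.
dqsr = DataQualitySoftwareRequirement(
    derived_from=dqr,
    kind="functional",
    statement="The registration form shall reject submission while the "
              "phone-number field is empty.",
)

print(dqsr.derived_from.dq_characteristic)  # Completeness
```

The point of the explicit `derived_from` link is exactly the traceability the paper argues for: each software requirement can be followed back to the viewpoint and ISO/IEC 25012 characteristic that motivated it.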

Given this complexity, and in an effort to ease development, we introduce DAQUAVORD, a methodology for project management of Data Quality Requirements Specification, which is based on the Viewpoint-Oriented Requirements Definition (VORD) method and the latest and most widely accepted ISO/IEC 25012 standard. It is universal and easily adaptable to different information systems in terms of their nature, the number and variety of actors, and other aspects. The paper presents both the methodology itself and an example of its application, which serves as step-by-step guidance on how to use it to achieve smarter software development with data quality by design. This paper is a continuation of our previous study and establishes the following research questions (RQs):

RQ1: What is the state of the art regarding the “data quality by design” principle in the area of software development? What are (if any) current approaches to data quality management during the development of IS?

RQ2: How should the concepts of Data Quality Requirements (DQR) and the Viewpoint-Oriented Requirements Definition (VORD) method be defined and implemented in order to promote the "data quality by design" principle?

Sound interesting? Read the full text of the article published in Elsevier Data & Knowledge Engineering – here.

The paper presents the first comprehensive approach to this problem, setting out a methodology for project management of the data quality requirements specification. Given the relative nature of the "data quality" concept and the ongoing discussion about a universal view on data quality dimensions, we based our proposal on the latest and most widely accepted ISO/IEC 25012 standard, thus seeking better integration of the methodology with the documentation, systems, and projects already existing in an organization. We expect this methodology to help Information System developers plan and execute a proper elicitation and specification of the data quality requirements expressed by the different roles (viewpoints) that interact with the application. It can serve as a guide that analysts follow when writing a Requirements Specification Document supplemented with Data Quality management. Identifying and classifying data quality requirements at the initial stage makes it easier for developers to be aware of the data quality to be implemented for each function throughout the entire development process of the application.

As future work, we plan to consider the advantages provided by the Model Driven Architecture (MDA), focusing mainly on its abstraction and modelling capabilities. This will make it much easier to integrate our results into the development of "Data Quality aware Information Systems" (DQ-aware-IS) with other software development methodologies and tools. It is, however, expected to expand the scope of the developed methodology and to cover various further features related to data quality, including the development of a conceptual measure of data value, i.e., intrinsic value, as previously proposed.

UPDATE: In July 2023 it also became one of the most downloaded articles from Data & Knowledge Engineering (Elsevier) over the preceding 90 days. Have not read it yet? Take a look – it is waiting for you 😉

César Guerra-García, Anastasija Nikiforova, Samantha Jiménez, Héctor G. Perez-Gonzalez, Marco Ramírez-Torres, Luis Ontañon-García, ISO/IEC 25012-based methodology for managing data quality requirements in the development of information systems: Towards Data Quality by Design, Data & Knowledge Engineering, 2023, 102152, ISSN 0169-023X, https://doi.org/10.1016/j.datak.2023.102152

Our EOSC TF "FAIR Metrics and Data Quality" paper "Towards a data quality framework for EOSC" has been released!🍷🍷🍷

I am glad to announce the release of the "Towards a data quality framework for EOSC" document, which we have been hard at work on for several months as the Data Quality subgroup of the "FAIR Metrics and Data Quality" Task Force of the European Open Science Cloud (EOSC) Association – Carlo Lacagnina, Romain David, Anastasija Nikiforova, Mari Elisa Kuusniemi, Cinzia Cappiello, Oliver Biehlmaier, Louise Wright, Chris Schubert, Andrea Bertino, Hannes Thiemann, Richard Dennis.

This document explains basic concepts to build a solid basis for a mutual understanding of data quality in a multidisciplinary environment such as EOSC. These range from the difference between quality control, assurance, and management to categories of quality dimensions, as well as typical approaches and workflows to curate and disseminate dataset quality information, minimum requirements, indicators, certification, and vocabulary. These concepts are explored with attention to the importance of evaluating resources carefully when deciding on the sophistication of quality assessments: human resources, technology capabilities, and capacity-building plans constrain the design of sustainable solutions. Distilling the knowledge accumulated in this Task Force, we extracted cross-domain commonalities, lessons learned, and challenges (each TF member brings their own experience and knowledge – we all represent different domains and therefore tried to make our contributions domain-agnostic, while still considering every nuance our specialisms can bring and what deserves to be heard by others).

The resulting main recommendations are:

  1. Data quality assessment needs standards; unfortunately, not all communities have agreed on standards, so EOSC should assist and push each community to agree on community standards to guarantee the FAIR exchange of research data. Although we extracted a few examples highlighting this gap, the current situation requires a more detailed and systematic evaluation in each community. Establishing a quality management function can help in this direction because the process can identify which standard already in use by some initiatives can be enforced as a general requirement for that community. We recommend that EOSC considers taking the opportunity to encourage communities to reach a consensus in using their standards.
  2. Data in EOSC need to be served with enough information for the user to understand how to read and correctly interpret the dataset, what restrictions are in place to use it, and what processes participate in its production. EOSC should ensure that the dataset is structured and documented in a way that can be (re)used and understood. Quality assessments in EOSC should not be concerned with checking the soundness of the data content. Aspects like uncertainty are also important to properly (re)use a dataset. Still, these aspects must be evaluated outside the EOSC ecosystem, which only checks that evidence about data content assessments is available. Following stakeholders’ expectations, we recommend that EOSC is equipped with essential data quality management, i.e., it should perform tasks like controlling the availability of basic metadata and documentation and performing basic metadata compliance checks. The EOSC quality management should not change data but point to deficiencies that the data provider or producer can address.
  3. Errors found by the curators or users need to be rectified by the data producer/provider. If not possible, errors need to be documented. Improving data quality as close to the source (i.e., producer or provider) as possible is highly recommended. Quality assessments conducted in EOSC should be shown first to the data provider to give a chance to improve the data and then to the users.
  4. User engagement is necessary to understand the user requirements (needs, expectations, etc.); it may or may not be part of a quality management function. Determining and evaluating stakeholder needs is not a one-time requirement but a continuous and collaborative part of the service delivery process.
  5. It is recommended to develop a proof-of-concept quality function performing basic quality assessments tailored to the EOSC needs (e.g., data reliability and usability). These assessments can also support rewarding research teams most committed to providing FAIR datasets. The proof-of-concept function cannot be a theoretical conceptualization of what is preferable in terms of quality. Still, it must be constrained by the reality of dealing with an enormous amount of data within a reasonable time and workforce.
  6. Data quality is a concern for all stakeholders, detailed further in this document. The quality assessments must be a multi-actor process between the data provider, EOSC, and users, potentially extended to other actors in the long run. The resulting content of quality assessments should be captured in structured, human- and machine-readable, and standard-based formats. Dataset information must be easily comparable across similar products, which calls for providing homogeneous quality information.
  7. A number of requirements valid for all datasets in EOSC (and beyond) and specific aspects of a maturity matrix gauging the maturity of a community when dealing with quality have been defined. Further refinement will be necessary for the future, and specific standards to follow will need to be identified.
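Recommendation 2 above limits EOSC quality management to tasks like checking the availability of basic metadata rather than judging data content. A minimal sketch of what such a check could look like is given below; the required field names and the report format are my own assumptions for illustration, not an EOSC specification.

```python
# Required descriptive metadata fields assumed for this illustration only.
REQUIRED_FIELDS = {"title", "creator", "license", "description", "identifier"}

def metadata_compliance(record: dict) -> dict:
    """Report which required fields are missing or empty.

    The check never modifies the data: it only points to deficiencies
    that the data provider or producer can then address.
    """
    missing = sorted(f for f in REQUIRED_FIELDS
                     if not str(record.get(f, "")).strip())
    return {"compliant": not missing, "missing": missing}

# Example dataset description with an empty field.
dataset = {
    "title": "River discharge observations 2010-2020",
    "creator": "Example Institute",
    "license": "CC-BY-4.0",
    "description": "",
    "identifier": "doi:10.0000/example",
}

report = metadata_compliance(dataset)
print(report)  # {'compliant': False, 'missing': ['description']}
```

Because the output is structured and machine-readable, reports of this kind could be compared across similar datasets, in the spirit of recommendation 6.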

We sincerely invite you to take a look at this very concise, 76-page-long overview of the topic and look forward to your recommendations, suggestions, and feedback – we hope to provide you with a convenient way to share them very soon, so take your time to read while we make our last preparations 📖 🍷📖🍷📖🍷 But make sure you have a glass of wine at hand while reading: this will make sense at some point, namely when we compare data quality with wine quality with reference to flavour type and intensity (intrinsic quality) and brand and packaging (extrinsic quality)… but no more teasers, and bon appétit! 🍷🍷🍷
The document can be found in an Open Access here.

We also want to acknowledge the contribution and input of colleagues from several European institutions, the EOSC Association, and several external-to-TF stakeholders who gave feedback based on their own experience, as well as the TF Support Officer Paola Ronzino, our colleagues Sarah Stryeck and Raed Al-Zoubi, and, last but not least, all respondents and everyone involved.

Call for Papers: Emerging Data- and Policy-driven Approaches for African Cities Challenges, Data & Policy, Cambridge University Press

On behalf of the Guest Editors, I sincerely invite you to consider submitting your work to our Special Issue "Emerging Data- and Policy-driven Approaches for African Cities Challenges", part of the open-access journal Data & Policy at Cambridge University Press.

This Special Issue aims to expand the reach and scope of urban data research, innovation, entrepreneurship activities, and policies to address urban challenges in Africa through the digitisation of cities. It will compile recent expert work on the topic to advance scientific excellence and to promote the digital transition and its benefits for creating, collecting, storing, and using urban data to achieve the Sustainable Development Goals (SDGs) in African cities.

African cities and their local actors and managers have been at the forefront of the digital transformation for several years now (Oke et al., 2020). Several urban projects across the continent, from north to south and east to west, claim the term "smart city" (Söderström et al., 2021). This apparently attractive name is often associated with an "isolationist" technical vision provided and marketed by operators with a very Western, global outlook. When digital and smart city projects are instead implemented together with citizens and the local ecosystem, and managed step by step by the municipalities, the digital transition can be aimed primarily at a "smart city of general interest". In developing countries, and especially in Africa, where the young, female, and urban population is becoming increasingly connected, the adoption of digital technologies is exponential and tends to occur without public intervention, owing among other things to the "datafication of cities" (Bibri & Krogstie, 2020; Plantinga, 2022; Oksman & Raunio, 2018). As a result, there is a risk that local authorities will "drop out" of the market, which may manifest itself in the development of alternative digital services by third parties that disrupt or compete with local public services. Another risk is that the local authority may have only limited or incomplete access to data produced by users and businesses within its territory, depriving it of the material necessary for its action. Local authorities in Africa, as in the North, are in a learning phase with their smart city or digital city policies and, in particular, with policies regarding data collection/acquisition, storage, and use to solve urban challenges (Plantinga, 2022; Oksman & Raunio, 2018).

Indeed, data is one of the essential pillars of an emerging smart or digital city, best used to support decision making in urban planning and management to address the challenges of cities in Africa. It is therefore appropriate for this Special Issue to cover all topics related to digital cities in Africa, including urban data and policy for urban planning applications, the African smart city, smart geoinformation systems (smart GIS), smart governance, challenges of digital cities in Africa, urban sustainability, planning/management issues of emerging cities in Africa, urban socio-economic challenges (education, health, employment, youth, economy, food security, etc.), the urban environment, and information and communication technologies applied to the city.

In addition to its thematic focus, it aims to advance interdisciplinary research by bridging the disciplinary divide between different academic cultures of the humanities, sciences, and application-oriented research, as well as the sectoral divide between urban development actors in Africa. Thus, this special issue will update and strengthen the existing literature on African cities through the results of scientific research based on qualitative and quantitative analysis techniques and methods on topics including, but not limited to data- and policy-driven approaches to address the challenges of African cities and mainly those related to:

💡Water and energy management;
💡Smart waste management and sanitation;
💡Digital management of education and health;
💡Digital mobility and transport management;
💡Quality of Life and social classes;
💡Strategies for digital and smart cities in Africa;
💡Digital and Smart African city stakeholders;
💡Digital and Smart city infrastructure;
💡Artificial intelligence and applications;
💡Digital governance for smart cities;
💡Citizen participation and engagement;
💡Datafication of smart cities;
💡Collective sensing & spatial big urban data;
💡Smart geo-addressing and participatory addressing;
💡Digital transformation and smart Governance;
💡Citizen and Collaborative Governance;
💡Climate, pollution, and environmental monitoring;
💡Disaster risks;
💡Urban Health


Papers to be submitted when ready, with final deadline: January 8, 2024.

Data & Policy publishes the following article types. Authors should consider which is the most appropriate category for their work before they submit:

  • Research articles: original work that uses rigorous methods to investigate how data science can inform or impact policy.
  • Commentaries: shorter articles (approx. 4,000 words) that discuss and/or problematize an issue relevant to the special issue topic.
  • Translational articles: focus on the policy setting or environment in which data science principles and approaches are being applied, with the aim of improving the transfer of knowledge from research to practice (and vice versa).
  • Data papers: provide structured descriptions of a data set relevant to the special issue. The data paper should describe the study design and methods that generated the data, but the focus should be to help others re-use the data rather than presenting new findings.

Guest Editors:

  • Jérôme Chenal, CEAT, EPFL, Lausanne, Switzerland
  • Stéphane C. K. Tekouabou, Center of Urban Systems (CUS), UM6P, Benguérir, Morocco
  • El Arbi Allaoui Abdellaoui, ENS, Mouley Ismail University, Meknès, Morocco
  • Anastasija Nikiforova, University of Tartu, Tartu, Estonia

References:

  • Bibri, S. E., & Krogstie, J. (2020). The emerging data–driven Smart City and its innovative applied solutions for sustainability: The cases of London and Barcelona. Energy Informatics, 3, 1-42.
  • Oke, A. E., Aghimien, D. O., Aigbavboa, C. O., & Akinradewo, O. I. (2020). Appraisal of the drivers of smart city development in South Africa. Construction Economics and Building, 20(2), 109-126.
  • Oksman, V., & Raunio, M. (2018, March). Citizen-centric smart city planning for Africa: A qualitative case study of early stage co-creation of a Namibian smart community. In The Twelfth International Conference on Digital Society and eGovernments (pp. 30-35).
  • Söderström, O., Blake, E., & Odendaal, N. (2021). More-than-local, more-than-mobile: The smart city effect in South Africa. Geoforum, 122, 103-117.
  • Plantinga, P. (2022). Digital discretion and public administration in Africa: Implications for the use of artificial intelligence. Information Development, 02666669221117526.

“Emerging issues and innovations” track as part of IFIP EGOV-CeDEM-EPART 2023 is open for submissions!

On behalf of the co-chairs of the "Emerging Issues and Innovations" track, I sincerely invite everyone whose research focuses on new topics emerging in the field of ICT and the public sector, including public-private ecosystems, to submit their work to this track, which is part of EGOV2023 – IFIP EGOV-CeDEM-EPART – one of the most recognized conferences in e-Government, ICT and public administration, and related topics!

The annual IFIP EGOV2023 will be hosted 5-7 September 2023 in Budapest by the Corvinus University of Budapest, Hungary. The conference focuses on e-Government, Digital Government, Open Government, Smart Government, GovTech, eParticipation and e-Democracy, and related topics like social media, digital transformation, Digital society, artificial intelligence, policy information, policy informatics, smart cities, and social innovation. Several types of submissions are possible, including completed research, ongoing research, reflections & viewpoints, posters, and workshops. The conference consists of 10 tracks:

  • General E-Government and E-Governance Track
  • General e-Democracy & e-Participation track
  • ICT and Sustainable Development Goals Track
  • Digital Society Track
  • AI, Data Analytics & Automated Decision Making Track
  • Smart Cities (Government, Districts, Communities & Regions) Track
  • Open data: social and technical aspects Track
  • Emerging Issues and Innovations Track
  • Digital and Social Media Track
  • Legal Informatics Track

And while the conference consists of 10 tracks among which you will definitely find a relevant one, my personal recommendation is the "Emerging Issues and Innovations" track (chairs: Marijn Janssen, Anastasija Nikiforova, Dr. Csaba Csaki, Francesco Mureddu).


🎯🎯🎯 “Emerging issues and innovations” track focuses on new topics emerging in the field of ICT and public sector, including public-private ecosystems. Topics of interest include but are not limited to:
💡Looking ahead into Social innovation
💡The future of government, policy making and democracy
💡Global challenges that go beyond nation states (such as migration, climate change etc.) and require international collaboration of individual governments
💡Digital transformation in public sector context
💡The future of digital governance
💡Public values in transforming the government
💡The role of government in eCities and sustainable living
💡The role of the public sector in Human Centered Society
💡Self Service Structures for Inclusion
💡Public-private sector collaboration and integration;
💡Decentralized Autonomous Organizations (DAO), smart contracts and blockchain
💡Preparing for the policy challenges of future technologies;
💡Regulating misinformation
💡New technologies for automated decision-making
💡The future public sector use and regulation of latest AI solutions;
💡Public use as well as regulations of industry 4.0 and the internet of things
💡The relationships of governments and Fintech
💡Upcoming issues of eVoting including application of digital signatures in the public sector
💡Online public community building
💡Utilization of digital billboards
💡Latest trends in co-creation and service delivery
💡Forward-looking insights from case studies – be they successful or failed experiments.
 

🗓️🗓️🗓️ IMPORTANT DATES
Deadline for submissions: 31 March 2023
PhD Colloquium deadline for submissions: 1 May 2023
Poster submission deadline: 20 May 2023
PhD Colloquium: 4 September 2023
Conference: 5-7 September 2023

Do not miss 3 days of discussions around e-Government, Digital Government, Open Government, Smart Government, GovTech, eParticipation and e-Democracy, and related topics like social media, digital transformation, Digital society, artificial intelligence, policy information, policy informatics, smart cities, and social innovation. Mark your calendar – 31 March 2023 for submitting your paper, and 5-7 September 2023 for attending the conference!

The conference is organized by the IFIP 8.5 Working group (WG8.5) and the Digital Government Society (DGS). The aim of WG 8.5 is to improve the quality of e-government information systems at international, national, regional and local levels. The WG8.5 emphasis is on interdisciplinary approaches for information systems in public administration. DGS is a global, multi-disciplinary organization of scholars and practitioners interested in the development and impacts of digital government. Read more here.