<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:sy="http://purl.org/rss/1.0/modules/syndication/" xmlns:atom="http://www.w3.org/2005/Atom" version="2.0">
  <channel>
    <title>Journal of Computing and Security</title>
    <link>https://jcomsec.ui.ac.ir/</link>
    <description>Journal of Computing and Security</description>
    <atom:link href="" rel="self" type="application/rss+xml"/>
    <language>en</language>
    <sy:updatePeriod>daily</sy:updatePeriod>
    <sy:updateFrequency>1</sy:updateFrequency>
    <pubDate>Mon, 01 Sep 2025 00:00:00 +0330</pubDate>
    <lastBuildDate>Mon, 01 Sep 2025 00:00:00 +0330</lastBuildDate>
    <item>
      <title>A Low-Code Approach for Developing Customizable Teacher Performance Analysis Dashboards for Moodle</title>
      <link>https://jcomsec.ui.ac.ir/article_29685.html</link>
      <description>Nowadays, analyzing teacher performance is crucial to improving instruction quality. Evaluating teachers can be effective in designing and delivering a course, managing the class, managing time, and supporting learners. Continuous teacher evaluation helps maintain high educational standards by ensuring that only qualified educators are recruited. In the current educational landscape, learning management systems (LMS) provide an environment for interaction and quality enhancement. Dashboards designed to analyze teacher performance in educational institutions can significantly aid in generating supervisory reports on course quality and improving teaching effectiveness. Additionally, various evaluation criteria, which may differ across institutions, can support these assessments. However, developing performance analysis dashboards for teachers is complex, requiring substantial time and financial investment. Existing research on such dashboards does not fully address the need for diverse dashboard types, highlighting the demand for a new solution. This study employs a low-code development approach to create a customizable platform for teacher performance analysis dashboards in Moodle LMS, tailored to specific evaluation criteria. The platform categorizes teacher performance indicators using a feature model, streamlining dashboard development and reducing time and cost. Its modular architecture enables rapid assembly of dashboard components, while the feature model allows dynamic selection and configuration of relevant metrics. The study was evaluated in two phases. First, three case studies across different subjects were analyzed. Then, usability testing was conducted via an online workshop and a questionnaire completed by school administrators and LMS experts. Results demonstrate faster dashboard development and high user satisfaction, confirming the platform's effectiveness.</description>
    </item>
    <item>
      <title>Aspect-Oriented Taxonomies of Requirements Development: A Systematic Review</title>
      <link>https://jcomsec.ui.ac.ir/article_29639.html</link>
      <description>This study focuses on requirements development, a vital phase in the success of software projects, with particular attention to the classification of both functional and non-functional requirements. Despite its crucial role, there has been relatively limited research focused on developing comprehensive taxonomies for requirements development processes. To bridge this gap, this paper conducts a systematic literature review, examining studies published between 2011 and 2024 across six electronic databases. From an initial pool of 1025 studies, 250 were selected for further review based on their relevance to eight research questions. After careful manual scrutiny and application of specific quality metrics, a final set of 90 studies was identified for detailed analysis. The review revealed that most existing taxonomies primarily focus on activities such as requirements elicitation, analysis, and modelling, often overlooking other important phases. Building upon previous work, this paper proposes four new taxonomies that encompass various stages of requirements development, including elicitation, analysis, modelling, specification, verification, and validation. These proposed taxonomies aim to provide a more holistic view of requirements engineering and serve as practical tools for engineers and stakeholders to improve the effectiveness and accuracy of requirements development. By integrating and expanding upon previous taxonomies, these new taxonomies are designed to address neglected aspects and facilitate better decision-making throughout the requirements process, ultimately leading to higher-quality software solutions. The proposed taxonomies support practitioners in identifying better approaches to requirements management and development, contributing to the overall success of software projects through more systematic and structured requirements practices.</description>
    </item>
    <item>
      <title>A Comprehensive Survey on Multi-hop Machine Reading Comprehension Datasets and Metrics</title>
      <link>https://jcomsec.ui.ac.ir/article_29640.html</link>
      <description>Multi-hop machine reading comprehension (MRC) is a challenging field in natural language processing whose goal is to answer questions based on disjoint pieces of information in a natural language context. Due to the complexity and importance of this field, a large number of studies have recently focused on multi-hop MRC, making it necessary and worthwhile to review them in detail. Since datasets and evaluation metrics play a crucial role in the progress of research in this field, this study aims to present a comprehensive survey of recent advances in multi-hop MRC evaluation metrics and datasets. In this regard, the existing multi-hop MRC evaluation metrics and datasets published from 2017 to 2025 are reviewed, and a comprehensive analysis is provided at the end. Open issues in this field are also discussed.</description>
    </item>
    <item>
      <title>Intrusion Detection In Computer Networks Using A Hybrid CNN-BiLSTM Network</title>
      <link>https://jcomsec.ui.ac.ir/article_30099.html</link>
      <description>Intrusion Detection Systems (IDS) are critical for securing computer networks by identifying and analyzing unauthorized or abnormal activities, thereby preventing cyberattacks. Due to the increasing complexity and volume of network data, there is a growing demand for advanced and efficient data analysis methods. This study proposes a hybrid deep learning model combining Convolutional Neural Networks (CNN) and Bidirectional Long Short-Term Memory (BiLSTM) networks to enhance the accuracy and robustness of intrusion detection. Initially, CNNs are employed to extract spatial features from raw network traffic data. These features are then processed by a BiLSTM network to capture temporal dependencies and contextual relationships. To further improve classification performance, three machine learning classifiers, k-Nearest Neighbors (k-NN), Decision Tree, and Support Vector Machine (SVM), are trained on the extracted features, and their outputs are integrated using a weighted voting ensemble method. The proposed model is evaluated using the NSL-KDD dataset, achieving an accuracy of 99% in binary classification and 99.12% in multi-class classification. The results demonstrate the effectiveness of the CNN-BiLSTM hybrid approach in accurately detecting both known and complex attack patterns in network traffic.</description>
    </item>
    <item>
      <title>Improving Drug Response Prediction using Dual Similarity Regularization</title>
      <link>https://jcomsec.ui.ac.ir/article_30188.html</link>
      <description>Personalized medicine aims to identify effective anticancer therapies tailored to individual patients, a core goal of precision oncology. Despite significant advances, achieving reliable and accurate drug response prediction remains challenging due to the complexity and heterogeneity of pharmacogenomic data. Motivated by the principle that similar cell lines exhibit similar responses to similar drugs, we propose an enhanced matrix factorization framework incorporating a novel dual similarity regularization strategy. The proposed Dual Similarity-Regularized Matrix Factorization (DSRMF) model constrains the latent representations of cell lines and drugs to preserve biological and chemical similarity relationships, ensuring that similar entities occupy proximate positions in the latent space while dissimilar ones remain distant. The model integrates two-dimensional (2D) and three-dimensional (3D) chemical structural features to construct a refined drug similarity matrix and was trained and validated on processed datasets from the Genomics of Drug Sensitivity in Cancer (GDSC) and the Cancer Cell Line Encyclopedia (CCLE). Experimental results demonstrate that DSRMF achieves robust predictive performance, with an average Pearson correlation coefficient (PCC) of approximately 0.96 and a root mean square error (RMSE) of 0.30, indicating a strong correlation between predicted and observed drug responses. These findings confirm that incorporating dual similarity regularization and heterogeneous biological information enhances both predictive accuracy and interpretability. Overall, DSRMF advances drug response modeling and provides a scalable framework for integrating multi-dimensional biological data to improve personalized cancer treatment strategies.</description>
    </item>
    <item>
      <title>An ant-colony-optimization-based method for community-aware link prediction in social networks</title>
      <link>https://jcomsec.ui.ac.ir/article_30291.html</link>
      <description>Link prediction is one of the primary issues in the social network analysis domain, focusing on predicting the emergence of future links. Among various approaches, metaheuristic-based methods stand out due to their ability to combine existing heuristic measures related to link formation with greedy exploration, enhancing the effectiveness of future link prediction. Many of these heuristics are designed based on the structural characteristics of social networks. However, only a limited number of methods have considered the community structure, a fundamental property of social networks. In this article, a novel link prediction method is introduced by integrating the concept of community structure into the ant colony optimization algorithm. Specifically, three common-neighbor-based heuristics are extended to consider the community structure and the distinct nature of intra-community and inter-community relationships, which are then utilized in the ant colony optimization search mechanism. The effectiveness of the proposed method is evaluated against several benchmark techniques on nine real-world datasets. The results demonstrate that the presented approach outperforms these techniques in terms of the AUC and precision comparison metrics.</description>
    </item>
    <item>
      <title>Integrating VGG16 with the ACIMD Protocol to Enhance Security and Reliability in ECG-Based Remote Authentication Systems</title>
      <link>https://jcomsec.ui.ac.ir/article_30292.html</link>
      <description>The security of Implantable Medical Devices (IMDs) is of paramount importance, as unauthorized access can lead to life-threatening consequences. Electrocardiogram (ECG) signals present a promising biometric modality for authentication due to their inherent uniqueness and liveness detection capability. However, wireless ECG-based systems are vulnerable to relay attacks, necessitating robust proximity verification. This study proposes a novel, integrated authentication framework that addresses both user identification and physical proximity. We enhance the established Access Control for Implantable Medical Devices (ACIMD) distance-bounding protocol by incorporating a fine-tuned VGG16 deep convolutional neural network for high-accuracy ECG biometric verification. The system was rigorously evaluated using the MIT-BIH Arrhythmia Database. The integrated model achieved an overall authentication accuracy of 99.45%, surpassing the baseline ACIMD protocol's accuracy of 97.82%. This represents an average improvement of 1.63% across various physical distance thresholds. While the integration of VGG16 increased the total authentication time from 0.0113 seconds to 0.0437 seconds, this remains well within acceptable limits for real-time medical applications. Crucially, we provide new evidence for improved system reliability, demonstrating superior robustness against signal noise compared to the baseline. The proposed system effectively balances high biometric fidelity with stringent physical-layer security, offering a comprehensive solution for secure remote authentication in critical applications like IMDs.</description>
    </item>
    <item>
      <title>A Q-Learning Approach for Dynamic Resource Management in Three-Tier Vehicular Fog Computing</title>
      <link>https://jcomsec.ui.ac.ir/article_30294.html</link>
      <description>In this paper, a method for predicting the resources required by an intelligent vehicle client using a three-layer vehicular computing architecture is proposed. This method leverages Q-Learning to optimize resource allocation and enhance overall system performance. This approach employs reinforcement learning capabilities to provide a dynamic and adaptive strategy for resource management in a fog computing environment. The key findings of this study indicate that Q-learning can effectively predict the appropriate allocation of resources by learning from past experiences and making informed decisions. Through continuous training and updating of the Q-learning agent, the system can adapt to changing conditions and make resource allocation decisions based on real-time information. The experimental results demonstrate the effectiveness of the proposed method in optimizing resource allocation. The Q-learning agent predicts the optimal values for memory, bandwidth, and processor. These predictions not only minimize resource consumption but also meet the performance requirements of the fog system. Implementations show that this method improves the average task processing time compared to the other methods evaluated in this study.</description>
    </item>
  </channel>
</rss>
