MEASURING AND COMPUTING DEVICES IN TECHNOLOGICAL PROCESSES
https://vottp.khmnu.edu.ua/index.php/vottp
ISSN: 2219-9365
Published: since May 1997
Publisher: Khmelnytskyi National University (Ukraine)
Frequency: 4 times a year
Manuscript languages: mixed: Ukrainian, English, Polish
Editor: Valeriy Martyniuk (Khmelnytskyi, Ukraine)
Certificate of state registration of print media: Series KB № 24923-14863 ПР (12.07.2021).
Registration: The journal is included in Category B of the List of scientific professional publications of Ukraine, in which the results of dissertations for the degrees of Doctor and Candidate of Sciences (specialties 121, 122, 123, 125, 126, 151, 152, 172) may be published. Order of the Ministry of Education and Science of Ukraine No. 1643 of December 28, 2019.
License terms: Authors retain copyright and grant the journal the right of first publication, with the work simultaneously licensed under a Creative Commons Attribution International (CC-BY) license, allowing others to share the work with acknowledgement of authorship and of initial publication in this journal.
Open access statement: "MEASURING AND COMPUTING DEVICES IN TECHNOLOGICAL PROCESSES" provides immediate open access to its content on the principle that free public access to research supports a greater global exchange of knowledge. Full-text access to the journal's scientific articles is provided on the official website in the Archives section.
Address: Scientific journal "MEASURING AND COMPUTING DEVICES IN TECHNOLOGICAL PROCESSES", Khmelnytskyi National University, st. 11, Khmelnytskyi, 29016, Ukraine.
Tel.: +380673817986
E-mail: vottp@khmnu.edu.ua
Website: https://vottp.khmnu.edu.ua/index.php/vottp/

MODELLING OF CYBER-PHYSICAL CONTROL SYSTEMS UNDER NEGATIVE EXTERNAL FACTORS
https://vottp.khmnu.edu.ua/index.php/vottp/article/view/454
<p class="06AnnotationVKNUES"><em>The effective modeling of cyber-physical control systems (CPCS) under negative external factors is critical for ensuring their reliability and stability. This paper provides a comprehensive analysis of the modeling approaches used to assess the impact of various external influences, such as environmental conditions, technical failures, and human factors, on the performance of these systems. The study emphasizes the importance of understanding how these factors can lead to significant disruptions in the operation of CPCS, potentially resulting in severe consequences for critical infrastructures.</em></p> <p class="06AnnotationVKNUES"><em>The research investigates different modeling methodologies, including mathematical models and simulation techniques, to evaluate the dynamics of CPCS in the presence of adverse conditions. Specifically, the paper discusses the application of Markov models to describe the system's states and transition probabilities, allowing for a detailed assessment of the likelihood of component failures. Additionally, the use of Monte Carlo simulations is explored as a means to generate various scenarios that account for external influences on system behavior.</em></p> <p class="06AnnotationVKNUES"><em>Moreover, the paper highlights the role of monitoring and control technologies, such as IoT and real-time data analytics, in enhancing the resilience of CPCS. These technologies facilitate continuous monitoring of system components, enabling proactive measures to mitigate the effects of negative external factors. The findings indicate that organizations implementing robust monitoring systems can significantly improve their responsiveness and adaptability to changing conditions.</em></p> <p class="06AnnotationVKNUES"><em>Furthermore, the study provides practical recommendations for designing resilient CPCS that can withstand external impacts. These include integrating advanced information technologies, optimizing data processing methods, and ensuring data security to protect against cyber threats. By adopting these strategies, organizations can enhance the operational efficiency of their systems while minimizing risks associated with external disruptions.</em></p> <p class="06AnnotationVKNUES"><em>The results of this research contribute to the understanding of how to effectively model and manage cyber-physical control systems in challenging environments. This work serves as a foundation for future studies aimed at improving the robustness and reliability of CPCS, ultimately leading to more secure and efficient operations in various industries.</em></p>Stanislav PEREPELITSAMariia YUKHYMUCHUKVladyslav LES'KO
Copyright (c) 2025 Станіслав ПЕРЕПЕЛИЦЯ, Марія ЮХИМЧУК, Владислав ЛЕСЬКО
https://creativecommons.org/licenses/by/4.0
2025-02-27 · No. 1 · pp. 7–12 · DOI: 10.31891/2219-9365-2025-81-1

CHECKING OF PHARMACEUTICAL HERBAL REMEDIES (HERBAL MEDICINES) USING AN ELECTRONIC NOSE
https://vottp.khmnu.edu.ua/index.php/vottp/article/view/455
Quality control of pharmaceutical herbal products is a key step in ensuring their efficacy, safety and compliance with regulatory standards. The emergence of modern innovations, including the electronic nose, paves the way for rapid and accurate assessment of herbal medicines. A gas sensor allows the detection of aromatic substances and their verification against control parameters necessary to maintain the stability and efficacy of herbal products. This paper examines in detail the fundamental principles of the artificial olfactory device, its use in the quality control of phytopreparations, and the advantages that this technology offers. This method allows the identification of complex mixtures of volatile compounds, the detection of adulteration, real-time quality control, and monitoring of changes associated with the storage or processing of raw materials. A device that simulates human smell, as an advanced screening tool, allows for the rapid and convenient identification of volatile natural compounds that form a unique "olfactory fingerprint" for each sample. This makes it possible not only to control the quality of raw materials and products, but also to detect fraud, assess quality stability during storage, and monitor production procedures. As an electronic nose for quality control of pharmaceutical herbal remedies and herbal medicines, the authors propose to use the MSRC-2, a multi-channel device for real-time recognition of odours and gas concentrations, which was developed at the Department of Information Radioelectronic Technologies and Systems of Vinnytsia National Technical University. This work highlights the concept of the electronic olfactometer, its advantages over conventional approaches, and its appropriate use in the pharmaceutical industry. Recognizing its importance, careful efforts ensure the standardization of herbal medicines using the electronic nose method, contributing to quality control and stability assessment, which is paramount for improving drug safety and developing modern control systems.

Andrii SEMENOV, Kateryna BONDARETS, Oleksandr STALCHENKO, Andrii KRYSTOFOROV, Oleksandr SHPYLOVYI
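A toy sketch of the "olfactory fingerprint" matching idea described above: a measured multi-sensor response vector is compared against reference fingerprints by cosine similarity. The sensor values, sample names, and threshold are all invented, and this is not the MSRC-2's actual signal processing.

```python
import numpy as np

# Hypothetical normalized responses of a 6-sensor gas array ("olfactory
# fingerprints"). Reference vectors would come from certified samples.
reference = {
    "chamomile":  np.array([0.82, 0.10, 0.35, 0.61, 0.22, 0.15]),
    "peppermint": np.array([0.25, 0.71, 0.48, 0.12, 0.55, 0.30]),
}

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def identify(sample, refs, threshold=0.95):
    """Match a measured fingerprint to the closest reference sample."""
    best, score = max(((name, cosine(sample, v)) for name, v in refs.items()),
                      key=lambda t: t[1])
    return (best, score) if score >= threshold else ("unknown/adulterated", score)

measured = np.array([0.80, 0.12, 0.37, 0.58, 0.25, 0.14])
print(identify(measured, reference))
```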
Copyright (c) 2025 Андрій СЕМЕНОВ, Катерина БОНДАРЕЦЬ, Олександр СТАЛЬЧЕНКО, Андрій КРИСТОФОРОВ, Олександр ШПИЛЬОВИЙ
https://creativecommons.org/licenses/by/4.0
2025-02-27 · No. 1 · pp. 13–21 · DOI: 10.31891/2219-9365-2025-81-2

APPLICATION OF MACHINE LEARNING METHODS FOR ANALYZING MEAT FRESHNESS DATA
https://vottp.khmnu.edu.ua/index.php/vottp/article/view/456
Consumers are increasingly questioning the quality of food products, particularly meat, and demanding reliable information about its freshness and safety. Traditional methods of assessing meat freshness, such as organoleptic and laboratory analyses, are often subjective, labor-intensive, and time-consuming. This has underscored the need for developing effective and rapid methods for determining meat freshness. In this context, modern technologies, including neural networks, have become highly relevant for identifying meat freshness. Neural networks, recognized for their powerful data processing and analysis capabilities, offer precise and automated methods for assessing food quality. This study investigates the application of machine learning methods, particularly using TensorFlow, to analyze meat freshness data. We propose a smart meat quality control system that combines a sensory network and a neural network. The system integrates gas sensors and a color sensor, with software employing a neural network for analysis and decision-making. The structure of the smart system, its operational principles, architectural features, and the training process of the neural network using TensorFlow are thoroughly detailed. Our project involves integrating sensors, an Arduino microcontroller, and a Raspberry Pi single-board computer to develop a system capable of accurately and reliably determining meat freshness. The results demonstrate that the proposed system can significantly enhance meat quality control by providing timely and accurate freshness assessments. This has profound implications for reducing food waste, protecting consumers from potential health risks, ensuring fair pricing, and improving overall quality control in the meat industry. Further research and development aimed at optimizing the model, increasing training data, and integrating real-time analysis are suggested to enhance the system's efficiency and practical application.

Bohdan BOGUSH, Oleh ROMANCHUKEVYCH
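A minimal TensorFlow sketch of the kind of classifier described, assuming seven input features (four gas-sensor readings plus three color-sensor values) and three freshness classes. The data below is synthetic and the architecture illustrative; it is not the authors' trained model.

```python
import numpy as np
import tensorflow as tf

# Features: 4 hypothetical gas-sensor readings + 3 RGB color-sensor values.
# Labels: 0 = fresh, 1 = half-fresh, 2 = spoiled. Data here is synthetic.
rng = np.random.default_rng(42)
X = rng.random((600, 7)).astype("float32")
y = rng.integers(0, 3, 600)

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(7,)),
    tf.keras.layers.Dense(32, activation="relu"),
    tf.keras.layers.Dense(16, activation="relu"),
    tf.keras.layers.Dense(3, activation="softmax"),  # three freshness classes
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.fit(X, y, epochs=5, batch_size=32, verbose=0)

# On the described system, readings would stream from the Arduino to the
# Raspberry Pi, which would run this model on each new measurement vector.
print(model.predict(X[:1], verbose=0))
```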
Copyright (c) 2025 Богдан БОГУШ, Олег РОМАНЧУКЕВИЧ
https://creativecommons.org/licenses/by/4.0
2025-02-27 · No. 1 · pp. 22–28 · DOI: 10.31891/2219-9365-2025-81-3

AUTHENTICATION METHODS IN EMBEDDED SYSTEMS WITH LIMITED COMPUTING RESOURCES
https://vottp.khmnu.edu.ua/index.php/vottp/article/view/457
<p class="06AnnotationVKNUES"><em>The article discusses various authentication methods used in embedded systems with limited computing resources. In today's world, the growing need for information technology security necessitates the implementation of effective data protection solutions. Embedded systems, as a rule, have limited capabilities in terms of memory, computing power and power consumption, which makes it difficult to implement complex authentication algorithms.</em></p> <p class="06AnnotationVKNUES"><em>The article analyzes the main authentication approaches, including passwords, one-time codes (OTPs), tokens, and multi-factor authentication (MFA). Each of these methods has its advantages and disadvantages, which the authors consider in detail. Authentication using passwords, although an easy method to implement, is vulnerable to match-based attacks. This necessitates the implementation of more complex security policies.</em></p> <p class="06AnnotationVKNUES"><em>At the same time, one-time codes generated for each authentication session provide an additional layer of protection, but their use may require additional resources for generation and verification. Tokens, both hardware and software, provide a higher level of security, but can be less convenient to use. Multi-factor authentication combines several methods, increasing security, but requires more resources and execution time, which can be critical for embedded systems.</em></p> <p class="06AnnotationVKNUES"><em>The authors of the article conducted comprehensive testing of various authentication methods in order to evaluate their effectiveness according to three main parameters: speed, security and resource consumption. The results of the study show that authentication methods differ significantly in execution speed and energy consumption. For example, authentication with passwords proved to be the fastest but with the lowest level of security, while one-time codes and tokens offered a higher level of protection but required more time to process.</em></p> <p class="06AnnotationVKNUES"><em>In conclusion, the article emphasizes the importance of choosing appropriate authentication methods that take into account the specifics of embedded systems. Recommendations from testing can be useful for developers and engineers working in the field, helping to balance security, speed, and resource efficiency. The study points to the need for further developments in the field of authentication to meet the needs of modern embedded systems and guarantee their reliability in the context of data security.</em></p>INNA ROZLOMIIEMIL FAURESERHII NAUMENKO
Copyright (c) 2025 Інна РОЗЛОМІЙ, Еміль ФАУРЕ, Сергій НАУМЕНКО
https://creativecommons.org/licenses/by/4.0
2025-02-27 · No. 1 · pp. 29–35 · DOI: 10.31891/2219-9365-2025-81-4

STUDY OF METHODS FOR CONSTRUCTING DECISION TREES FOR THE IMPLEMENTATION OF THE RANDOM FOREST ALGORITHM IN THE MEDICAL FIELD
https://vottp.khmnu.edu.ua/index.php/vottp/article/view/458
<p class="06AnnotationVKNUES"><em>Every year, machine learning is increasingly facilitating areas of modern life, ranging from entertainment services to solving difficult tasks related to improving people's work and lives. It is especially important now to apply these analysis methods in the medical field in order to save as many lives as possible, to diagnose diseases as early as possible for easier or timelier treatment. This work is devoted to the same topic, which aims to develop methods to prevent the development of psychological disorders among patients with hypothyroidism and hyperthyroidism. In one of the observations of this topic, it was determined that the best way to predict possible difficulties is the random forest algorithm, which consists in building various decision trees. It is worth noting that it is necessary to choose the right way to develop each tree from such alternatives as ID3, CHAID, C4.5, CART, and XGBoost. All of them were analyzed using linear additive convolution based on such data as the type of tree building algorithm, data distribution criterion, data types, numerical data processing, tree pruning method, tendency to overlearn, algorithm speed, how clear the model interpretation is, and application methods. First, a table was filled out with data from decision tree methods according to the above-mentioned features, then all qualitative indicators were converted into quantitative ones for mathematical calculations when calculating convolution values for each alternative. According to the results of the experiment, greedy CART is the best algorithm for developing a decision tree model that is easy to interpret, fast, and least prone to overtraining, operates with numerical and categorical data, uses the Gini index to divide data into subsets when determining the next attribute from the list of features, and supports pruning of its structure. After conducting the experiment, the advantages and disadvantages of the chosen model of the multicriteria problem of this work are also considered.</em></p>Nural HULIIEV
Copyright (c) 2025 Нурал ГУЛІЄВ
https://creativecommons.org/licenses/by/4.0
2025-02-27 · No. 1 · pp. 36–43 · DOI: 10.31891/2219-9365-2025-81-5

IMPLEMENTATION OF A SERVER PROTECTION SYSTEM TAKING INTO ACCOUNT ANOMALIES IN PACKETS
https://vottp.khmnu.edu.ua/index.php/vottp/article/view/459
<p class="06AnnotationVKNUES"><em>The article discusses methods of real-time network traffic analysis based on statistical methods and machine learning algorithms for classifying network packets by their behavioral characteristics. The presented system implements a multi-level approach to server protection, which includes three main stages: primary data filtering, statistical analysis, and the use of machine learning models. The relevance of this issue stems from the need to ensure real-time server protection, which requires high-speed traffic analysis and system adaptability to emerging threats. Modern solutions must not only detect known threats but also identify new, previously unknown attack patterns by analyzing traffic behavioral characteristics. The early anomaly detection module is a key component of the system, enabling the identification of potentially malicious actions at an early stage. To counter new, previously unknown types of attacks, the use of deep neural networks and clustering algorithms is particularly important, as it allows real-time analysis of traffic behavior patterns. The ability to respond to threats before they can cause harm to the infrastructure ensures effective early detection. The presented models allow adapting to new types of attacks by automatically updating them. This makes it possible to detect both traditional DDoS attacks (port scanning, exploitation of network protocol vulnerabilities and SQL injection attempts) and other types of threats. The integration of the presented protection system with existing monitoring tools and firewalls will ensure the accuracy of early detection of DDoS attacks, low false-positive rates, and reliable real-time protection of servers and ease of implementation. Future development prospects for the system include enhancing machine learning algorithms for precise anomaly detection, expanding the functionality of filtering modules, and integrating with cloud technologies to ensure the protection of scalable infrastructures.</em></p>Petro PONOCHOVNYYuriy PEPA
Copyright (c) 2025 Петро ПОНОЧОВНИЙ, Юрій ПЕПА
https://creativecommons.org/licenses/by/4.0
2025-02-27 · No. 1 · pp. 44–51 · DOI: 10.31891/2219-9365-2025-81-6

OBJECTIVITY EVALUATION OF COGNITIVE BIAS MITIGATION IN NEUROEDUCATIONAL STRATEGY WITH THE ALLOCATION OF VULNERABLE INDICATORS
https://vottp.khmnu.edu.ua/index.php/vottp/article/view/460
<p class="06AnnotationVKNUES"><em>At the current level of distribution and development of neuro-computer interfaces, an important factor in the effectiveness of their implementation remains the objectivity of usage strategies, which is determined by the constructive, ideological and technical features of the system that embody a certain implementation of a specific task. The work represents the results of a study of the effectiveness of cognitive bias mitigation in neuroeducational strategy. When designing and testing the system, typical objectivity indicators such as reliability, validity, consistency were taken into account. In order to increase objectivity, a number of restrictions were introduced at the stage of candidate selection, test development, testing boundaries, and assessment requirements, the importance of which is assessed separately. The assessment was carried out using different methods to increase the accuracy of variable results. Two strategies were compared in order to determine objectivity indicators, search for accuracy and quality factors, ways of further development, debugging, and implementation. Data from studies of the dynamics of cognitive metrics, neuroplasticity training, neuroliteracy testing, and the relationship between control and cognitive bias in decision-making in professional and educational settings with varying degrees of uncertainty were used. The results can be used to compare the quality indicators of user and educational neuro-computer systems and decision-making support technologies.</em></p>Vitalii MYKHALCHUK
Copyright (c) 2025 Віталій МИХАЛЬЧУК
https://creativecommons.org/licenses/by/4.0
2025-02-27 · No. 1 · pp. 52–58 · DOI: 10.31891/2219-9365-2025-81-7

PROSPECTS FOR THE USE OF THERMAL IMAGERS FOR DETECTING RADIOELECTRONIC MEANS OF COVERT INFORMATION RETRIEVAL AT INFORMATION ACTIVITY SITES
https://vottp.khmnu.edu.ua/index.php/vottp/article/view/461
<p class="06AnnotationVKNUES"><em>In the conditions of constant improvement of methods and methods of application of means of covert information obtaining at the object of information activity, the issue of their timely identification, detection and neutralization is relevant. The article considers the prospects and possibilities of detecting such means using software and hardware complexes based on thermal imagers by thermal radiation.</em></p> <p class="06AnnotationVKNUES"><em>As a result of the research conducted, the article presents the results of the analysis of the main technical characteristics of modern samples of thermal imagers equipped with microbolometric matrices of leading companies in the world. A formalized model of the thermal channel of information leakage at the OID according to the energy criterion has been developed and on its basis proposals have been made for improving the technical characteristics of thermal imaging devices. The proposals provided are aimed at improving the efficiency of using thermal imaging devices in detecting and protecting against potential threats originating from radio-electronic means of information extraction. It is shown that the proposals provided increase the quality of detection and neutralization of radio-electronic means of information extraction, which can be used by an attacker on the OID.</em></p> <p class="06AnnotationVKNUES"><em>The practical value of the research results lies in improving the quality of detection and neutralization of the work of radio-electronic means of information extraction, which can be used on the OID when conducting special studies by improving the operation of thermal imaging devices.</em></p>Mykola KONOTOPETSOLEKSANDR TUROVSKYIgor SAMOYLOVYevhen KOMONYUK
Copyright (c) 2025 Микола КОНОТОПЕЦЬ, Олександр ТУРОВСЬКИЙ, Ігор САМОЙЛОВ, Євген КОМОНЮК
https://creativecommons.org/licenses/by/4.0
2025-02-27 · No. 1 · pp. 59–69 · DOI: 10.31891/2219-9365-2025-81-8

USING GRAPH DATABASES IN CYBERSECURITY
https://vottp.khmnu.edu.ua/index.php/vottp/article/view/422
This article delves into the application of graph databases within the realm of cybersecurity, emphasizing their exceptional capability to model and scrutinize intricate relationships among various components of information systems. Unlike traditional relational databases, which primarily focus on structured data and predefined schemas, graph databases excel in representing interconnected data, making them particularly suited for capturing the dynamic and multifaceted nature of cyber threats. The primary objective of this study is to elucidate the pivotal role that graph databases play in the detection and prevention of cyberattacks, while also assessing their distinct advantages over conventional relational database systems.

To achieve this objective, the research employs a comprehensive methodology that begins with a thorough review of existing approaches to utilizing graph structures for modeling cyber threats. This involves analyzing how graph databases can effectively map out the complex interactions between different entities such as users, devices, network traffic, and malicious activities. The study further explores the development of innovative methods that seamlessly integrate graph databases with advanced machine learning algorithms and Explainable AI (XAI) techniques. This integration aims to enhance both the accuracy and transparency of cybersecurity systems, ensuring that threat detection mechanisms are not only precise but also understandable to end-users and security analysts.

The findings of the study are compelling, demonstrating that the implementation of graph databases can significantly bolster the accuracy of threat detection by up to 25% compared to traditional relational databases. This improvement is attributed to the graph database's ability to uncover hidden patterns and relationships that are often missed by relational models. Additionally, the response time to security incidents is reduced by approximately 30%, highlighting the efficiency gains achieved through faster data retrieval and processing inherent to graph databases. These enhancements are crucial in a cybersecurity context, where timely detection and response to threats can prevent substantial financial losses and mitigate damage to organizational infrastructure.

Moreover, the integration of Explainable AI (XAI) with graph databases offers substantial benefits in terms of algorithmic transparency. By providing clear and interpretable explanations for the decisions made by machine learning models, XAI fosters greater trust among users and stakeholders. This transparency is vital for compliance with regulatory standards and for enabling security professionals to validate and refine threat detection strategies effectively. The study underscores that the combination of graph databases with XAI not only improves the technical performance of cybersecurity systems but also enhances their usability and reliability from a user perspective.

In conclusion, the research highlights the transformative potential of graph databases in advancing cybersecurity measures. The superior ability of graph databases to model complex relationships, coupled with the precision of machine learning algorithms and the clarity provided by Explainable AI, positions them as indispensable tools in the fight against cyber threats. However, the study also identifies several areas for future research, including the optimization of graph database performance and scalability to handle ever-growing volumes of data and more sophisticated attack vectors. Additionally, there is a pressing need to develop standardized methodologies for integrating graph databases with existing cybersecurity frameworks, ensuring seamless interoperability and maximizing the benefits of these advanced technologies. By addressing these challenges, future developments can further enhance the robustness and effectiveness of cybersecurity systems, ultimately contributing to a more secure digital landscape.

Andrii SEMENIUK, Maria YUKHYMCHUK
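A small sketch of why multi-hop relationship queries are natural on a graph store: the entity graph (users, hosts, IPs) is traversed directly rather than joined table-by-table. Here the `networkx` library stands in for a graph database, and the nodes and edges are invented; this is not the authors' system.

```python
import networkx as nx

# Toy entity graph: users, hosts, and external IPs as nodes; edges carry
# the relationship type. Data is illustrative only.
g = nx.DiGraph()
g.add_edge("user:alice", "host:web-01", rel="logged_in")
g.add_edge("host:web-01", "host:db-01", rel="connected")
g.add_edge("ip:203.0.113.7", "user:alice", rel="authenticated_from")
g.add_edge("ip:203.0.113.7", "user:bob", rel="authenticated_from")

# One classic graph question: does any external IP reach the database
# through a chain of relationships? Each extra hop would cost another
# join in a relational schema; here it is a single traversal.
for ip in [n for n in g if n.startswith("ip:")]:
    if nx.has_path(g, ip, "host:db-01"):
        print(ip, "->", nx.shortest_path(g, ip, "host:db-01"))
```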
Copyright (c) 2025 Андрій СЕМЕНЮК, Марія ЮХИМЧУК
https://creativecommons.org/licenses/by/4.0
2025-02-27 · No. 1 · pp. 70–78 · DOI: 10.31891/2219-9365-2025-81-9

DYNAMICS OF THE CONNECTIVITY PROBABILITY OF A DISTRIBUTED INFORMATION SYSTEM OVER TIME, CONSIDERING RANDOM INFLUENCES ON ELEMENTS
https://vottp.khmnu.edu.ua/index.php/vottp/article/view/462
<p class="06AnnotationVKNUES"><em>The design, subsequent implementation, and operation of information systems are integral elements in solving many applied problems today. Particularly relevant and interesting in several situations in the case where conditions dictate that the developed information system will operate autonomously—that is, without human intervention or with only negligible human involvement. Against this backdrop, studying the functional reliability of an information system becomes especially important.</em></p> <p class="06AnnotationVKNUES"><em>Numerous methods have been developed to quantitatively assess a system's functional reliability. These methods are often referred to as functional reliability indicators and include, for example, connectivity probability, edge and vertex connectivity degree, and many others. However, their primary drawback is that they only allow an assessment of functional reliability at specific moments of operation. Meanwhile, during the design phase, it may be critically necessary to represent them as functions of time. Such a necessity often arises, for example, when studying the degradation dynamics of a designed information system.</em></p> <p class="06AnnotationVKNUES"><em>Currently, there are mathematically grounded attempts to represent some functional reliability indicators as time-dependent functions. However, these attempts generally do not account for possible random influences on individual elements of the information system or on the system as a whole. This paper attempts to generalize the representation of connectivity probability as a time function to the case where the load on each element is not a strictly deterministic variable changing over time but rather a random process with a normal distribution at each moment in time. The study considers a scenario where it is known that the mean and standard deviation of element loads in the information system are periodic functions with a predetermined period, and their values are known at specific moments throughout the period.</em></p>Оleg BARABASHOLEG KOPIIKAAndriy MAKARCHUK
Copyright (c) 2025 Олег БАРАБАШ, Олег КОПІЙКА, Андрій МАКАРЧУК
https://creativecommons.org/licenses/by/4.0
2025-02-27 · No. 1 · pp. 79–85 · DOI: 10.31891/2219-9365-2025-81-10

ARCHITECTURE OF A DISTRIBUTED FUNCTIONALLY RESILIENT MONITORING SYSTEM FOR VEHICLES BASED ON BLOCKCHAIN TECHNOLOGY
https://vottp.khmnu.edu.ua/index.php/vottp/article/view/463
<p class="06AnnotationVKNUES"><em>The article is devoted to the study of methods for ensuring the functional resilience of distributed information systems for vehicle movement monitoring using blockchain technology. Given the rapid development of intelligent transportation systems and the growing requirements for their reliability, ensuring the uninterrupted operation of such systems is critically important for the safety and efficiency of road traffic. In particular, the main challenges for such systems remain resilience to hardware and software failures, adaptation to dynamic changes in transportation infrastructure, protection against cyberattacks, and ensuring the integrity and reliability of data.</em></p> <p class="06AnnotationVKNUES"><em>The aim of the study is to develop an architecture for distributed systems based on blockchain technology, capable of withstanding failures, cyberattacks, and dynamic load changes. The primary focus is placed on the integration of decentralized consensus mechanisms, smart contracts, and automatic recovery algorithms to ensure operational stability even in cases of partial system failures. Blockchain technology is considered one of the most promising platforms for addressing these challenges due to its unique characteristics, including a decentralized approach, cryptographic protection, immutability of records, and data transparency. The integration of blockchain into vehicle monitoring systems enables the creation of fault-tolerant architectures capable of functioning even under conditions of partial node failures. Additionally, the use of decentralized consensus mechanisms and smart contracts facilitates process automation and enhances the overall efficiency of traffic flow management. The article proposes a software architecture that integrates blockchain technology, decentralized consensus mechanisms, and smart contracts to automate management processes and ensure system adaptability.</em></p> Olena BANDURKARoman ROMANIVOlha SVYNCHUK
Copyright (c) 2025 Олена БАНДУРКА, Роман РОМАНІВ, Ольга СВИНЧУК
https://creativecommons.org/licenses/by/4.0
2025-02-27 · No. 1 · pp. 86–96 · DOI: 10.31891/2219-9365-2025-81-11

CHROME EXTENSION FOR MAINTAINING HEALTH DURING BROWSER WORK
https://vottp.khmnu.edu.ua/index.php/vottp/article/view/425
<p class="06AnnotationVKNUES"><em>This paper is dedicated to the research and implementation of a browser extension that helps users maintain their health during prolonged computer work. The main tasks of this extension include reminding users about the need for regular breaks to perform physical exercises that help relieve muscle tension in the back and eyes, providing users with the ability to adjust the frequency and type of reminders according to their individual needs and preferences, and providing access to specialized exercises and recommendations through web pages that contribute to improved health and well-being.</em></p> <p class="06AnnotationVKNUES"><em>An analysis of alternatives highlighted the importance of creating an extension that includes the following features: reminders for eyes and posture, informing about the importance of maintaining water balance in the body, the ability to activate and deactivate notifications, configure reminder times, a page with exercises for posture and eyes, a countdown timer during exercises, posture correctness checking, and notifications about incorrect posture even if the user is on other tabs.</em></p> <p class="06AnnotationVKNUES"><em>Functional and non-functional requirements for the extension were analysed and a use case diagram was created. The software product was implemented using the Chrome API, utilizing the following APIs: storage (for storing user data), alarms (for time tracking), notifications (for sending notifications to the user), tts (for text-to-speech), and tabs (for managing browser tabs). The choice of technologies used for the development of the software product was justified: TypeScript programming language, React library, MUI component library for creating user interfaces, Visual Studio Code as the development environment, and Vite for project build.</em></p> <p class="06AnnotationVKNUES"><em>The results of machine learning were used to estimate posture.</em></p>Olha TARNOVETSKAKateryna HAZDIUKOleksandr VALValeriia VODYANCHUK
Copyright (c) 2025 Ольга ТАРНОВЕЦЬКА, Катерина ГАЗДЮК, Олександр ВАЛЬ, Валерія ВОДЯНЧУК
https://creativecommons.org/licenses/by/4.0
2025-02-27 · No. 1 · pp. 97–107 · DOI: 10.31891/2219-9365-2025-81-12

JUSTIFICATION OF THE CHOICE OF THE ELEMENT BASE OF THE STABILIZATION SYSTEM OF THE MOBILE ROBOT EQUIPMENT OF THE "MINI" CLASS
https://vottp.khmnu.edu.ua/index.php/vottp/article/view/430
The article is devoted to the development of the equipment stabilization system installed on a wheeled mobile robot of the "mini" class, namely, to the justification of the choice of the system's functional elements. For the effective use of various equipment (digital cameras, antennas, radars, various devices) when a mobile robot moves over terrain with a complex surface profile, it is important to ensure stabilization of the equipment in the horizon plane.

The paper analyzes the modern element base that can be used to develop a stabilization system in accordance with the requirements for the accuracy and speed characteristics of functional elements, and taking into account the limitations on mass and dimensional characteristics. As the sensitive element of the stabilization system, the use of an attitude and heading reference system (AHRS) built on microelectromechanical systems (MEMS) technology is substantiated. Based on a comparative analysis of the characteristics of such AHRS units from different manufacturers, the choice of the AHRS with the most acceptable mass and dimensional characteristics and accuracy of determining the angles of platform deviation from the horizon plane is substantiated. A comparative analysis was carried out and the choice of an actuator motor was justified among brushless direct-current electric motors, small in size and weight, that provide torque sufficient for stabilizing a platform weighing 4-5 kg. Existing microcontroller architectures are considered, and the choice of the main microcontroller on which the conversion and control algorithms and the adaptive digital regulators of the equipment stabilization system will be implemented is justified.

The characteristics of the selected functional elements of the stabilization system will be used in the developed models of the system when conducting computer simulations and researching the quality of the stabilization process in complex conditions of mobile robot movement. Based on the selected functional elements, a physical model of the stabilization system will be created and tested for installation on a "mini" class mobile robot.

Andrii OSOVTSEV, Nadiya BOURAOU
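As a toy stand-in for the adaptive digital regulators mentioned above, here is a discrete PID loop holding a platform level against a constant disturbance. The gains, the single-axis dynamics, and the disturbance are all invented for illustration; the article's regulators and plant model are not reproduced here.

```python
class PID:
    """Discrete PID controller (illustrative gains, fixed time step)."""
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_err = 0.0

    def update(self, setpoint: float, measured: float) -> float:
        err = setpoint - measured
        self.integral += err * self.dt
        deriv = (err - self.prev_err) / self.dt
        self.prev_err = err
        return self.kp * err + self.ki * self.integral + self.kd * deriv

# Hold the platform level (0 deg roll) against a constant disturbance.
pid = PID(kp=8.0, ki=2.0, kd=0.5, dt=0.01)
angle, rate = 5.0, 0.0            # initial roll error (deg) and rate (deg/s)
for _ in range(500):              # 5 s simulation at 100 Hz
    torque = pid.update(0.0, angle)       # AHRS would supply `angle` here
    rate += (torque - 1.0) * 0.01         # toy dynamics + constant disturbance
    angle += rate * 0.01
print(f"residual roll after 5 s: {angle:.3f} deg")
```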
Copyright (c) 2025 Андрій ОСОВЦЕВ, Надія БУРАУ
https://creativecommons.org/licenses/by/4.0
2025-02-27 · No. 1 · pp. 108–117 · DOI: 10.31891/2219-9365-2025-81-13

OPTIMIZATION OF METROLOGICAL SUPPORT FOR INTEGRATED CONTROL SYSTEMS IN THE CONTEXT OF DIGITAL TRANSFORMATION
https://vottp.khmnu.edu.ua/index.php/vottp/article/view/464
<p class="06AnnotationVKNUES"><em>The article is devoted to the study of the optimization of metrological support for integrated control systems in the context of digital transformation. It highlights that metrology is a crucial element in the effective functioning of such systems, as it ensures the accuracy and reliability of measurements. Particular attention is given to the impact of digital technologies on the automation of measurement processes, the integration of modern metrological instruments, and the creation of digital platforms for data management. The article examines how digital solutions contribute to the efficiency of integrated control systems, reducing costs and improving productivity.</em></p> <p class="06AnnotationVKNUES"><em>Digital technologies are becoming the foundation for implementing new approaches in measurement processes, allowing the integration of various measuring instruments into a unified system for automated data collection, processing, and analysis. This reduces the likelihood of errors, shortens the time and costs associated with processing measurement results, and increases the overall reliability of the system. Data management platforms enable the efficient processing of large volumes of information, ensuring the accuracy and timeliness of decision-making in the process of quality and safety management.</em></p> <p class="06AnnotationVKNUES"><em>The article also highlights the importance of improving metrological support in the context of rapid digital transformation to achieve high standards of quality and safety across various industries. It proposes specific steps to enhance the metrological support of such systems, including the implementation of advanced digital tools and standards that meet the requirements of modern production and management. The adoption of these solutions will not only increase the efficiency of integrated control systems but will also contribute to the overall development of the industry, ensuring high technological standards and the competitiveness of enterprises.</em></p>Yuliia SALABAINazar KURYLYAK
Copyright (c) 2025 Юлія САЛАБАЙ, Назар КУРИЛЯК
https://creativecommons.org/licenses/by/4.0
2025-02-27 · No. 1 · pp. 118–121 · DOI: 10.31891/2219-9365-2025-81-14

THE APPLICATION OF ARTIFICIAL INTELLIGENCE FOR MANAGING HUMAN-CENTERED INFORMATION TECHNOLOGY PROJECTS
https://vottp.khmnu.edu.ua/index.php/vottp/article/view/465
<p class="06AnnotationVKNUES"><em>The article explores the impact of modern artificial intelligence (AI) tools on the management of human-centered information technology (IT) projects. The relevance of integrating artificial intelligence technologies into project management processes in the context of rapid digital transformation, competitive pressure, and the growing complexity of requirements for IT products focused on the current users' needs is considered. The authors propose the idea and concept of a software solution that will provide opportunities to evaluate the effectiveness of AI implementation in order to optimize project management processes for various IT areas of human activity. The research highlights the key benefits of integrating artificial intelligence into the project management system, including automation of routine tasks, improving the quality of analytics, adaptability to changing conditions, improving interaction between project participants, testing different versions of IT products, behavioral research, and target audience segmentation. The mechanisms of AI application at different stages of the IT project life cycle are described, including initiation, idea generation, planning, implementation, monitoring, and completion. Particular attention is paid to the use of tools such as chatbots, machine learning models, and large language models (LLM), which help to improve customer focus, quality of the final product, and efficiency of teamwork coordination.</em></p> <p class="06AnnotationVKNUES"><em>The proposed software solution contains modules for data entry, analysis, forecasting, visualization, and reporting, which allows it to be flexibly adapted to various areas of IT projects. It is noted that the implementation of AI changes traditional approaches to management, creating conditions for the generation of new business models and management strategies. The author also emphasizes the importance of ensuring data privacy and security in the process of AI implementation.</em></p> <p class="06AnnotationVKNUES"><em>The results of the study indicate that the integration of artificial intelligence contributes to a significant increase in the efficiency of IT project management, allowing companies to remain competitive in the face of rapid market changes and constant challenges. The presented approach can be useful for project managers, analysts, and developers interested in using the latest technologies to increase productivity, customer focus, and achieve strategic goals.</em></p>Denys GUDAKOVYanina KOLODINSKA
Copyright (c) 2025 Денис ГУДАКОВ, Яніна КОЛОДІНСЬКА
https://creativecommons.org/licenses/by/4.0
2025-02-27 · No. 1 · pp. 122–129 · DOI: 10.31891/2219-9365-2025-81-15

INTELLIGENT SYSTEM FOR ENSURING UAV SWARM FLIGHT UNDER JAMMING AND THE IMPACT OF ENEMY COUNTER-DRONE SYSTEMS
https://vottp.khmnu.edu.ua/index.php/vottp/article/view/466
This article addresses the critical and timely task of employing strike FPV drones in kamikaze mode to neutralize enemy counter-unmanned aerial systems (C-UAS) and trench-based electronic warfare systems (EWS), and to disrupt enemy logistics within a tactical range of 1 to 50 kilometers. Given the increasing sophistication of adversarial countermeasures, such as jamming and interception systems, this study emphasizes the need for a robust and adaptive approach to ensure the operational effectiveness of kamikaze drones in highly contested environments. The study proposes a comprehensive algorithm for UAV flight operations that integrates multiple protection strategies against various threats, including C-UAS systems, EWS, tactical missile systems (TMS), and advanced radar systems. The flight mode mimics that of a cruise missile, maintaining low-altitude trajectories (1–20 meters) to evade detection and mitigate the effects of enemy countermeasures. The low-altitude profile also enables obstacle avoidance, critical for terrain navigation in complex battlefield conditions.

A key contribution of this work is the development of a novel approach to utilizing intelligent systems (IS) to maximize the operational range of FPV kamikaze drones. These systems are designed to function effectively under adversarial jamming conditions, in the absence of satellite navigation, and within the inherent limitations of inertial navigation systems. The proposed methodology ensures high mission accuracy, achieving target precision within ±0.2 meters, even at ultra-low altitudes of 0.5 to 2 meters.

To achieve these results, the paper introduces an integrated IS-based optimization framework that evaluates and adjusts flight parameters in real time. The optimization criteria include maintaining an uninterrupted line of sight for signals, ensuring high-quality data transmission with minimal delay, enhancing resistance to electronic jamming, selecting optimal frequency bands (7 GHz to 38 GHz), and stabilizing UAV flight dynamics. Advanced on-board capabilities such as real-time wave propagation analysis, electronic countermeasure assessment, visual odometry, obstacle avoidance, and horizontal flight stabilization form the backbone of the proposed approach. These innovations collectively enable the drones to execute their missions with maximum efficiency, even in highly adverse conditions. The study also outlines key technical requirements for next-generation unmanned aerial systems (UAS) to enhance their resilience and operational efficiency, paving the way for further advancements in drone-based tactical operations.

Oleksandr SALIY, Sergii VOYTENKO, Victor TARANENKO, Juliy BOIKO, Volodymir DRUZHYNIN
Copyright (c) 2025 Олександр САЛІЙ, Сергій ВОЙТЕНКО, Віктор ТАРАНЕНКО, Юлій БОЙКО, Володимир ДРУЖИНІН
https://creativecommons.org/licenses/by/4.0
2025-02-27 · No. 1 · pp. 130–143 · DOI: 10.31891/2219-9365-2025-81-16

COMPARATIVE ANALYSIS OF TEMPERATURE SENSORS
https://vottp.khmnu.edu.ua/index.php/vottp/article/view/452
<p class="06AnnotationVKNUES"><em>The article discusses different types of temperature sensors based on different operating principles, such as phase-locked loop (PLL), proportional-to-absolute-temperature (PTAT), bipolar transistors, delay lines, temperature-dependent oscillators, binary counters, and periodic oscillators. Their main characteristics, advantages and disadvantages are described. PLL-based sensors have high accuracy and stability, but complex construction. PTAT sensors are characterized by linearity and low power consumption. Sensors based on bipolar transistors provide high accuracy and wide measurement ranges. Delay lines provide fast response times but require fine tuning. Temperature-dependent oscillators and binary counters have high accuracy and low power consumption, but can be sensitive to electromagnetic interference. Sensors that measure the period of the signal provide high accuracy and are suitable for energy-efficient applications.</em></p> <p class="06AnnotationVKNUES"><em>The article provides a detailed overview of each type of sensor, emphasizing their relevance and application in various fields of electronics.</em></p>Oleksandr MALIUKVolodymyr MARTYNIUK
Copyright (c) 2025 Олександр МАЛЮК, Володимир МАРТИНЮК
https://creativecommons.org/licenses/by/4.0
2025-02-27 · No. 1 · pp. 144–151 · DOI: 10.31891/2219-9365-2025-81-17

ANALYSIS OF IMAGE GENERATION MODELS WITH TEXTUAL ELEMENTS
https://vottp.khmnu.edu.ua/index.php/vottp/article/view/445
This article addresses the problem of generating images with integrated textual content, which is a highly relevant challenge for modern artificial intelligence technologies. Despite significant advancements in image generation using diffusion models, accurately rendering text remains a challenge due to the complexity of maintaining correct character sequences and the placement of textual elements. The study aims to evaluate the ability of four modern models (DALL-E, Flux, RecraftV3, and TextDiffuser-2) to generate high-quality text with varying input lengths and to identify critical points at which the quality of textual elements on generated images significantly deteriorates.

For the experimental part, a set of text prompts was created, ranging from 1 to 15 words, including simple words, short phrases, and more complex sentences. Each prompt was processed ten times by each model, providing a representative sample of results. The analysis of the generated images allowed for identifying critical points (the text lengths at which the models fail to produce correct text) and classifying typical errors in the generated images.

The results indicate significant differences between the models: RecraftV3 demonstrated the highest stability, maintaining text accuracy up to 14 words, while DALL-E-3 and Flux-1-Pro showed quality degradation after 5 words. TextDiffuser-2 exhibited a high error rate, limiting its use in tasks where accuracy is critical. The study's findings have practical value for further improving image generation algorithms, particularly in advertising, design, and automated visual content creation.

Roman SHAPTALA, Yaroslava YAKOVENKO
Copyright (c) 2025 Роман ШАПТАЛА, Ярослава ЯКОВЕНКО
https://creativecommons.org/licenses/by/4.0
2025-03-10 · No. 1 · pp. 152–159 · DOI: 10.31891/2219-9365-2025-81-18

METHODS FOR ORGANIZING THE FUNCTIONING OF MULTI-COMPUTER SYSTEMS OF COMBINED ANTIVIRAL BAITS AND TRAPS IN CORPORATE NETWORKS
https://vottp.khmnu.edu.ua/index.php/vottp/article/view/467
<p class="06AnnotationVKNUES"><em>The work has developed methods for organizing decision -making and functioning of fraudulent systems on the basis of previous experience of functioning and different options for completing tasks. To do this, formal submission of components in the architecture of multi -computer systems and the connections between them. It is proposed to differentiate the system center and decision -making controller. The tasks of the Center of the system include preparation of options for completing tasks, and the decision -making controller includes evaluation of options for completing tasks, taking into account the previous experience of their application and the choice of one of them. Analytical expressions have been developed to describe the processes in multicomputer systems used in systems to ensure the ability of systems to make independent decision -making on the tasks performed.</em></p> <p class="06AnnotationVKNUES"><em>The purpose of the work was to improve decision making with multi -computer systems with combined antiviral baits and traps on further steps by forming polymorphic answers to events, taking into account the previous experience of using the response and functioning of systems.</em></p> <p class="06AnnotationVKNUES"><em>According to the results of the proposed decisions, the prototype of the system was developed and experiments were conducted with it. The experiments were conducted for the case with the choice of one of the five variants of tasks and for the case of just one option. According to the results of the experiment, it was found that the stability of the system during its functioning is the best for the first case, that is, taking into account the proposed decisions, compared to the traditional approach in the second option. Thus, developed decisions on the functioning of systems, taking into account the previous experience, allowed to suggest more stable systems.</em></p>Antonina KASHTALIAN
Copyright (c) 2025 Антоніна КАШТАЛЬЯН
https://creativecommons.org/licenses/by/4.0
2025-03-10 · No. 1 · pp. 160–171 · DOI: 10.31891/2219-9365-2025-81-19

ANALYSIS OF THE RETRIEVAL-AUGMENTED GENERATION METHOD IN THE AREA OF LEGAL CONTRACT GENERATION
https://vottp.khmnu.edu.ua/index.php/vottp/article/view/468
<p class="06AnnotationVKNUES"><em>The subject of the study is the method of Retrieval-Augmented Generation of machine learning for generating contracts under limited resources and methods of comparing and assessing their effectiveness. The work aims to analyze the method of search-augmented text generation for the development of independent specialized systems and to assess their effectiveness for generating contracts in different languages in different legal systems. The following tasks are solved in the article: determining the method of Retrieval-Augmented Generation for adapting models to narrowly focused industries; analyzing the methods of evaluating and comparing such systems; identifying the limitations of existing solutions and approaches; finding the optimal approach under limited resources. The following results were obtained: the method of RAG was investigated in combination with large language models; the architecture of systems based on this approach, its main structural components, and possible variations were considered; the advantages and disadvantages of use in specialized industries were determined; two methods were compared: large general-purpose language models without additional tuning and modifications and systems using RAG for adaptation to the selected subject area. As a result of a practical experiment, it was determined that the RAG method significantly improves the accuracy and completeness of answers in the legal field compared to standard models but requires more time and computational resources. The article provides an overview of Retrieval-Augmented Generation method for text information generation using large language models, considers their advantages and disadvantages, and discusses the possibility of application in limited financial and human resources conditions. The paper considers a specialized legal field and the problem of contract generation and determines the effectiveness of the selected method for its solution.</em></p>Vitalii VOLOKHOVSKYI
Copyright (c) 2025 Віталій ВОЛОХОВСЬКИЙ
https://creativecommons.org/licenses/by/4.0
2025-03-10 · No. 1 · pp. 172–179 · DOI: 10.31891/2219-9365-2025-81-20

ANALYSIS OF TRAFFIC ANOMALY DETECTION MODELS IN MODERN INFORMATION AND COMMUNICATION SYSTEMS AND NETWORKS
https://vottp.khmnu.edu.ua/index.php/vottp/article/view/471
The aim of the study is to analyze existing models for detecting anomalies in network traffic, to assess their advantages and disadvantages, and to develop criteria for determining the feasibility of using these models in information and communication systems (ICS). This allows for a deeper understanding of the capabilities and limitations of different approaches to detecting threats in networks, and for assessing the effectiveness of the methods used to ensure system security. This paper provides a detailed analysis of current research in this area, covering various approaches to detecting attacks such as SQL injections, DoS attacks, botnets, man-in-the-middle attacks, and other network traffic anomalies. The work focuses on comparing models such as machine learning, fuzzy logic, hybrid models, neural networks, genetic algorithms, and autoencoders, as well as traditional methods including signature analysis and data classification.

One of the main tasks is to develop criteria by which to compare models, including the type of attack, the approach used, the complexity of configuration, the load on the network, the analysis of incoming and outgoing traffic, and the accuracy of the model. These criteria help determine which model is best for a particular type of attack, and which is most suitable for resource-limited environments or for use in scalable systems. The study collected data on the effectiveness of various models based on real-world examples, demonstrating their accuracy, ability to adapt in real time, and efficiency in processing large volumes of network traffic. Hybrid models that combine different methods to increase the efficiency of anomaly detection were also considered.

Despite high accuracy results, some models have limitations, such as high setup complexity or computational costs. In particular, methods based on genetic algorithms require significant computing resources, while simpler machine-learning models can be set up quickly and work effectively with limited resources. In other words, the accuracy and speed of models are directly related to their ability to integrate into existing ICS, where computing limitations and data-processing speed requirements are also important. In addition, the impact of network load on the effectiveness of an anomaly detection system is considered: for large volumes of traffic, choosing low-load methods is critical, since models with high computational costs can adversely affect network performance, which is an important aspect when implementing them in real-world conditions.

Nataliia PETLIAK
Copyright (c) 2025 Наталія ПЕТЛЯК
https://creativecommons.org/licenses/by/4.0
2025-02-27 · No. 1 · pp. 180–186 · DOI: 10.31891/2219-9365-2025-81-21

MULTILEVEL MODELING AND STATE IDENTIFICATION OF NODES IN PUBLIC INFORMATION AND TELECOMMUNICATION NETWORKS USING AN INFORMATION SECURITY MONITORING SUBSYSTEM
https://vottp.khmnu.edu.ua/index.php/vottp/article/view/470
The article examines a multilevel approach to analyzing complex technical systems for modeling hazardous and critical information security (IS) events in the elements and nodes of public information and telecommunication networks (ITN). A methodology is proposed to enhance the effectiveness of the IS monitoring subsystem at various logical levels of the ITN structure. Methods of general systems theory, game theory, reliability theory, fuzzy set theory, probability theory, mathematical statistics, classification theory, and graph theory are utilized to formalize the processes of threat modeling and risk assessment. A multilevel model of critical IS events is proposed for both the entire system and its individual components. The model considers attack scenarios, threat propagation, and vulnerabilities that can be exploited by adversaries with different levels of capability and intent. A probabilistic graph of unauthorized access realization is developed, accounting for different types of intruders, including external and internal actors with varying degrees of privilege escalation. The study analytically describes four IS state classes, considering Type I and Type II control errors, and introduces a refined classification of IS incidents based on their severity and impact on network functionality. Mathematical expressions for evaluating the probabilities of network compromise, system resilience, and normal operation under different threat conditions are derived. The proposed approach enables the assessment of ITN security levels even under incomplete information about the intruder's strategies and available resources. The impact of threats at different system levels is considered without linking it to a specific entry point, which allows for a more comprehensive analysis of network resilience. Additionally, practical recommendations are provided for improving IS monitoring through adaptive security policies and automated response mechanisms. The effectiveness of the proposed methodology is validated through simulation experiments, demonstrating its applicability in real-world network environments. The findings contribute to the development of proactive IS strategies aimed at minimizing risks and ensuring ITN stability under various cyber threat conditions.

Andrii PROKOPENKO, Mykyta TRENOV
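The article's derived expressions are not reproduced in the abstract; as a standard building block of the kind such models rest on, the probability that at least one of n independently attacked nodes with compromise probabilities p_i is compromised is

```latex
P_{\text{comp}} = 1 - \prod_{i=1}^{n}\left(1 - p_i\right)
```

For example, three nodes with p = 0.1 each give P_comp = 1 - 0.9^3 = 0.271, which illustrates how risk accumulates across levels even when each individual node is fairly well protected. This is a textbook independence assumption, not the article's formula.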
Copyright (c) 2025 Андрій ПРОКОПЕНКО, Микита ТРЕНЬОВ
https://creativecommons.org/licenses/by/4.0
2025-02-27 | 2025-02-27 | 1 | pp. 187–194 | DOI: 10.31891/2219-9365-2025-81-22
CONVOLUTIONAL NEURAL NETWORK WITH PROJECTIVE-INVARIANT POOLING
https://vottp.khmnu.edu.ua/index.php/vottp/article/view/433
<p><em>This paper addresses the problem of image classification under projective transformations and proposes a convolutional neural network (CNN) architecture incorporating a projective invariant pooling layer. Unlike classical affine transformations, for which well-known equivariant transformations exist (e.g., steerable convolutional neural networks, harmonic H-Nets), the problem of finding projective equivariance remains open. This paper takes a step towards solving this problem and proposes an implementation of projectively invariant pooling. Compared to a standard CNN, we demonstrate that incorporating such pooling enhances the robustness of our network to projective distortions. Experiments are conducted on the proMNIST and rotoMNIST image datasets, generated from the standard MNIST dataset by applying corresponding transformations.</em></p>Anna BEDRATYUK
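The abstract does not reproduce the pooling construction itself; as a rough sketch of one way to approximate projective invariance, the fragment below max-pools feature responses over a small fixed set of sampled homographies using PyTorch's grid_sample. The homography set, normalized-coordinate warping, and max aggregation are assumptions for illustration, not the architecture proposed in the paper.

```python
import torch
import torch.nn.functional as F

def warp_homography(x: torch.Tensor, H: torch.Tensor) -> torch.Tensor:
    """Warp a feature map x of shape (N, C, h, w) by a 3x3 homography
    acting on normalized coordinates in [-1, 1]."""
    n, _, h, w = x.shape
    ys, xs = torch.meshgrid(torch.linspace(-1, 1, h, device=x.device),
                            torch.linspace(-1, 1, w, device=x.device),
                            indexing="ij")
    ones = torch.ones_like(xs)
    pts = torch.stack([xs, ys, ones], dim=-1).reshape(-1, 3)   # (h*w, 3)
    warped = pts @ H.T                                         # (h*w, 3)
    grid = (warped[:, :2] / warped[:, 2:3]).reshape(1, h, w, 2)
    return F.grid_sample(x, grid.expand(n, -1, -1, -1),
                         align_corners=True, padding_mode="zeros")

class ProjectivePool(torch.nn.Module):
    """Max-pool feature responses over a fixed set of sampled homographies,
    a crude surrogate for projective invariance (illustrative only)."""
    def __init__(self, homographies):
        super().__init__()
        self.register_buffer("Hs", torch.stack(homographies))  # (K, 3, 3)

    def forward(self, x):
        responses = [warp_homography(x, H) for H in self.Hs]
        return torch.amax(torch.stack(responses), dim=0)

pool = ProjectivePool([torch.eye(3),
                       torch.tensor([[1.0, 0.0, 0.1],
                                     [0.0, 1.0, 0.0],
                                     [0.2, 0.0, 1.0]])])
y = pool(torch.randn(2, 8, 28, 28))  # -> (2, 8, 28, 28)
```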
Copyright (c) 2025 Ганна БЕДРАТЮК
https://creativecommons.org/licenses/by/4.0
2025-02-27 | 2025-02-27 | 1 | pp. 201–209 | DOI: 10.31891/2219-9365-2025-81-24
NEURAL NETWORK METHOD FOR DIAGNOSING PSYCHOLOGICAL DISEASES BY ANALYZING MESSAGES BASED ON A SEPARATE APPROACH TO CLASSIFICATION
https://vottp.khmnu.edu.ua/index.php/vottp/article/view/453
<p><em>The present work considers the use of neural network methods for diagnosing psychological diseases through the analysis of messages. The study shows that the use of NLP technologies and deep neural networks for automated assessment of the psycho-emotional state of individuals can significantly increase the accuracy of diagnostics, as well as ensure scalability and innovation in the field of healthcare. The use of such technologies also corresponds to the global goals of sustainable development, in particular in improving mental health, developing digital infrastructure and reducing inequalities in access to medical services.</em></p> <p><em>The proposed method is based on a separate approach to the classification of various psychological diseases, which increases the accuracy and reliability of diagnostics. Each psychological disease is analyzed separately, which avoids mutual influence between classes during classification. The diagnostic process includes three main stages: tokenization of texts using appropriate tokenizers, processing of tokens using neural network models trained from scratch on specialized text sets, and drawing conclusions about the probability of the presence of each of the diseases.</em></p> <p><em>Two open datasets were used to train the models. Experimental results showed high efficiency of the proposed method: accuracy values vary within 0.81–0.90, and Precision, Recall and F1-score indicators reach 0.91, which indicates high classification accuracy and the ability to differentiate psychological states. The proposed method demonstrates better results compared to existing analogues and has great potential for use in the automated detection of mental diseases. Further research can be aimed at improving the robustness of the models to language variations and at expanding the set of diagnosed states.</em></p>Oleksandr OVCHARUKOleksandr MAZURETS
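The separate approach can be made concrete in a few lines: one independent binary classifier per disease, so the classes never compete inside a single multi-class head. The sketch below uses a lightweight TF-IDF plus logistic regression stand-in; the actual method uses dedicated tokenizers and neural models trained from scratch, and the label set and sample texts here are assumptions.

```python
# Sketch of the "separate approach": one independent binary classifier per
# psychological disease. TF-IDF + logistic regression is a lightweight
# stand-in; the paper itself uses tokenizers and neural networks trained
# from scratch on specialized text sets.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

DISEASES = ["depression", "anxiety"]  # illustrative label set

def train_separate_classifiers(texts, labels_per_disease):
    """labels_per_disease: dict mapping disease -> list of 0/1 labels."""
    models = {}
    for disease in DISEASES:
        clf = make_pipeline(TfidfVectorizer(),
                            LogisticRegression(max_iter=1000))
        clf.fit(texts, labels_per_disease[disease])
        models[disease] = clf
    return models

def diagnose(models, message: str) -> dict:
    """Independent probability of each disease for one message."""
    return {d: float(m.predict_proba([message])[0, 1])
            for d, m in models.items()}

texts = ["i feel hopeless and tired", "what a lovely day",
         "my heart races before every exam", "lunch was great"]
labels = {"depression": [1, 0, 0, 0], "anxiety": [0, 0, 1, 0]}
models = train_separate_classifiers(texts, labels)
print(diagnose(models, "i cannot stop worrying"))
```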
Copyright (c) 2025 Олександр ОВЧАРУК, Олександр МАЗУРЕЦЬ
https://creativecommons.org/licenses/by/4.0
2025-02-27 | 2025-02-27 | 1 | pp. 210–216 | DOI: 10.31891/2219-9365-2025-81-25
DETERMINATION OF OPTIMAL PARAMETERS OF THE PANDA ARM ROBOT MANIPULATOR
https://vottp.khmnu.edu.ua/index.php/vottp/article/view/472
<p class="06AnnotationVKNUES"><em>When using robotic manipulators at enterprises, the task of determining the optimal parameters of robotic manipulators that would take into account the parameters of its functioning, namely the absence of obstacles on the trajectories of movement, the safety of workers working near the robotic manipulator and the minimum energy consumption during the technological process, is relevant.</em></p> <p class="06AnnotationVKNUES"><em>In the process of performing tasks by robotic manipulators within the technological process, there are a large number of possible parameters of its functioning, which include, for example, the trajectory of movement, the coordinates at which objects are captured, etc. In this paper, we consider a set of possible trajectories for a robot to follow when picking up objects from boxes and moving them to a conveyor. The motion trajectories were obtained by creating a virtual prototype of the robot and calculated by the parabolic mixing method.</em></p> <p class="06AnnotationVKNUES"><em>The resulting motion trajectories were analyzed in terms of energy consumption. As a result, it was found that reducing the length of the linear trajectory of the robot's movement leads to an increase in energy consumption; reducing the bending of the joints when moving along the trajectory leads to a decrease in energy consumption; the motor of the first joint requires the least energy; the main share of the energy consumed is the total length of the trajectory traversed by all the joints of the robot manipulator; the length of the linear trajectory of the robot manipulator does not significantly affect the lifting of objects, so it can be considered only in terms of energy consumption.</em></p>Andriy SEMENYSHENIuliia SOKOLANPavlo MAIDANDenys MAKARYSHKIN
Copyright (c) 2025 Андрій СЕМЕНИШЕН, Юлія СОКОЛАН, Павло МАЙДАН, Денис МАКАРИШКІН
https://creativecommons.org/licenses/by/4.0
2025-03-18 | 2025-03-18 | 1 | pp. 217–224 | DOI: 10.31891/2219-9365-2025-81-27
ANALYSIS OF THE IMPACT OF INPUT PARAMETERS ON THE ACCURACY OF GRAIN TEMPERATURE FORECASTING USING NEURO-FUZZY SYSTEMS
https://vottp.khmnu.edu.ua/index.php/vottp/article/view/443
<p class="06AnnotationVKNUES"><em>The article analyzes the impact of input parameters on the accuracy of grain temperature forecasting in storage facilities using an Adaptive Neuro-Fuzzy Inference System (ANFIS). Particular attention is given to identifying the most significant parameters that ensure the high efficiency of the model, including incorporating the current grain temperature as an integral indicator reflecting the energy state of the grain mass. Adding this parameter allowed the model to account for initial conditions and significantly improve the prediction of thermal processes.</em></p> <p class="06AnnotationVKNUES"><em>In addition to numerical parameters, such as air temperature, relative humidity, and wind speed, the model also considers time series formed based on previous values of grain temperature and environmental conditions. For this purpose, a sliding window with a width of 8 intervals (2 hours) was used, enabling the model to analyze short-term temperature changes and the dynamics of external conditions. This approach enhances forecasting accuracy by accounting for dependencies between the current and previous states of the grain mass.</em></p> <p class="06AnnotationVKNUES"><em>Special attention is paid to integrating qualitative parameters, such as the type and variety of grain, represented as linguistic variables using fuzzy logic. The use of expert data facilitated the creation of a rule-based system that adapts to the specifics of different grain crops and ensures the model's high flexibility under various storage conditions.</em></p> <p class="06AnnotationVKNUES"><em>The modeling results demonstrated that the developed ANFIS model with an optimal set of parameters achieves significantly lower RMSE values compared to baseline models such as ARIMA and LSTM. In particular, the model confirmed its superiority in forecasting accuracy for various storage zones, ensuring temperature stability and timely detection of thermal self-heating risks. The obtained results highlight the importance of considering both numerical and qualitative parameters to improve the efficiency of automated grain storage temperature monitoring systems.</em></p>Andrii LISHCHUK
Copyright (c) 2025 Андрій ЛІЩУК
https://creativecommons.org/licenses/by/4.0
2025-03-18 | 2025-03-18 | 1 | pp. 225–234 | DOI: 10.31891/2219-9365-2025-81-28
TEMPORARY AND PERMANENT SOLUTIONS FOR THIRD-PARTY SERVICE INTEGRATION IN THE VIRTUAL EDUCATIONAL ENVIRONMENT KSU24
https://vottp.khmnu.edu.ua/index.php/vottp/article/view/451
<p class="06AnnotationVKNUES"><em>Virtual educational environments have become a fundamental tool in modern education, meeting the evolving demands of contemporary learning. These platforms offer a broad range of functionalities, including interactive course materials, collaborative tools, and personalized learning pathways, making them indispensable in higher education institutions worldwide.</em></p> <p class="06AnnotationVKNUES"><em>One of the key functional requirements for such environments is the efficient integration with Ukraine’s state electronic services and existing educational software platforms already implemented in higher education institutions. Seamless integration ensures that administrative and academic processes are optimized, reducing redundancy and enhancing overall institutional efficiency. This interoperability allows students and faculty to access vital resources, manage academic records, and communicate effectively within a unified digital ecosystem.</em></p> <p class="06AnnotationVKNUES"><em>A significant challenge lies in managing and overseeing the implementation of university business processes within the virtual educational environment. These processes include student enrollment, grading, course scheduling, accreditation compliance, and financial transactions. Ensuring smooth execution requires a robust system capable of handling diverse data structures and regulatory requirements while maintaining security and data integrity.</em></p> <p class="06AnnotationVKNUES"><em>To address these challenges, the development of an integrated data archiving and registration system based on the JSON format is proposed. JSON provides a lightweight, flexible, and widely accepted data interchange format that supports efficient data management and interoperability. The proposed system aims to streamline data storage, retrieval, and exchange while ensuring compatibility with existing platforms. By leveraging JSON-based architectures, educational institutions can enhance operational efficiency, facilitate secure data transactions, and improve the overall digital learning experience.</em></p> <p class="06AnnotationVKNUES"><em>This research will explore the technical implementation of such a system, its impact on educational process management, and the potential benefits it offers to both students and administrative staff. The findings will contribute to the ongoing development of smarter, more adaptive virtual educational environments that align with global trends in digital education.</em></p>Denis SENCHYSHYNOleksandr LEMESHCHUK
Copyright (c) 2025 Денис CЕНЧИШЕН, Олександр ЛЕМЕЩУК
https://creativecommons.org/licenses/by/4.0
2025-03-18 | 2025-03-18 | 1 | pp. 235–243 | DOI: 10.31891/2219-9365-2025-81-29
SIMULATION MODEL FOR PREDICTIVE EVALUATION OF POST-PRINTING PROCESS QUALITY
https://vottp.khmnu.edu.ua/index.php/vottp/article/view/380
<p><em>The methodology for predictive evaluation of the quality of post-press processes has been developed using fuzzy logic methods and tools. A fuzzy system for assessing the quality of the investigated technological process has been created, consisting of two main components: fuzzification and defuzzification. Fuzzification converts crisp values into fuzzy variables. The stages of fuzzification include generating partial quality indicators, defining a universal set of values and corresponding term sets, developing a fuzzy inference model, processing membership functions of linguistic variables, creating knowledge bases, knowledge matrices, and fuzzy logic equations. Defuzzification converts fuzzy results into solutions applicable in real-world conditions. This process involves creating tables of membership function values, calculating fuzzy logic equations, and obtaining a specific integral quality indicator for post-press processing of book publications.</em></p> <p><em>An algorithm for a simulation model for predictive evaluation of post-press process quality has been developed based on the fuzzy system. The software product Post Press Forecast was created to determine the predicted integral quality indicator based on user-defined values of universal sets of linguistic variables. The reverse operation involves selecting the desired integral indicator value and setting acceptable deviation limits, resulting in a table of possible parameter combinations for factors. This enables the selection of optimal parameters to achieve the expected result, significantly reducing the number of substandard copies and shortening the time-to-market for book publications. As a result, production profitability, quality, and competitiveness of the final product are enhanced.</em></p>Alona KUDRIASHOVAIryna PIKHOleh LYTOVCHENKOVolodymyr PETRYK
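The fuzzification, rule evaluation, and defuzzification chain can be illustrated with a toy one-input Mamdani system using centroid defuzzification; the membership functions, the two rules, and the 0-100 quality scale below are assumptions for demonstration, not the Post Press Forecast knowledge base.

```python
# Toy fuzzy quality predictor: fuzzification -> Mamdani rule clipping ->
# centroid defuzzification. All term sets and rules are illustrative.
import numpy as np

def tri(x, a, b, c):
    """Triangular/shoulder membership function with support [a, c]."""
    x = np.asarray(x, dtype=float)
    left = (x - a) / (b - a) if b > a else np.ones_like(x)
    right = (c - x) / (c - b) if c > b else np.ones_like(x)
    return np.clip(np.minimum(left, right), 0.0, 1.0)

def predict_quality(binding_strength: float) -> float:
    """One input, two rules:
    IF strength is LOW  THEN quality is LOW;
    IF strength is HIGH THEN quality is HIGH."""
    mu_low = tri(binding_strength, 0.0, 0.0, 0.6)    # input term sets
    mu_high = tri(binding_strength, 0.4, 1.0, 1.0)
    q = np.linspace(0, 100, 501)                     # quality universe
    out_low = np.minimum(mu_low, tri(q, 0, 0, 50))   # Mamdani clipping
    out_high = np.minimum(mu_high, tri(q, 50, 100, 100))
    agg = np.maximum(out_low, out_high)
    return float(np.sum(q * agg) / np.sum(agg))      # centroid

print(f"predicted integral quality: {predict_quality(0.7):.1f}")
```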
Copyright (c) 2025 Альона КУДРЯШОВА, Ірина ПІХ, Олег ЛИТОВЧЕНКО, Володимир ПЕТРИК
https://creativecommons.org/licenses/by/4.0
2025-03-25 | 2025-03-25 | 1 | pp. 244–249 | DOI: 10.31891/2219-9365-2025-81-30
COMPARISON OF MULTI-CHANNEL ACCESS MODES IN THE IEEE 802.11BE NETWORK
https://vottp.khmnu.edu.ua/index.php/vottp/article/view/474
<p class="06AnnotationVKNUES"><em>The article showcases the technical capabilities of the new Wi-Fi 7 standard (802.11be), particularly focusing on the implementation of multi-link operation (MLO) technology to enhance bandwidth, reduce latency, and improve overall network efficiency. The study delves into the fundamental principles of MLO and its role in optimizing wireless communication by enabling devices to simultaneously utilize multiple frequency bands for data transmission.</em></p> <p class="06AnnotationVKNUES"><em>A detailed examination of the operational modes of multi-link devices is provided, including the Multi-Link Single Radio (MLSR) and Multi-Link Multi-Radio (MLMR) architectures. Furthermore, the paper introduces and analyzes the enhanced versions of these architectures, namely Enhanced Multi-Link Single Radio (EMLSR) and Enhanced Multi-Link Multi-Radio (EMLMR), highlighting their improvements in network stability, interference mitigation, and throughput efficiency. The study also discusses specific data transmission features that make Wi-Fi 7 a significant advancement over its predecessors.</em></p> <p class="06AnnotationVKNUES"><em>To assess the real-world performance of Wi-Fi 7 networks, the research incorporates simulation-based evaluations of network performance under various configurations involving multi-link transmission and competing devices. These simulations investigate key performance metrics, such as throughput, latency, and packet delivery efficiency, under different levels of network congestion and interference.</em></p> <p class="06AnnotationVKNUES"><em>Results from these simulations underscore the benefits of multi-channel access in scenarios requiring high performance and minimal latency, such as gaming, video streaming, and enterprise-level networking. The study concludes that the adoption of Wi-Fi 7 and MLO technology has the potential to significantly enhance wireless networking capabilities, paving the way for a more robust and responsive internet experience in high-demand environments.</em></p>Denis TABOR
Copyright (c) 2025 Денис ТАБОР
https://creativecommons.org/licenses/by/4.0
2025-03-25 | 2025-03-25 | 1 | pp. 250–255 | DOI: 10.31891/2219-9365-2025-81-31
RESEARCH OF METHODS OF AUTOMATED MANAGEMENT OF REGISTRATION OF PEOPLE IN THE ROOM
https://vottp.khmnu.edu.ua/index.php/vottp/article/view/475
<p class="06AnnotationVKNUES"><em>The article studies the methods of registering people, which form the basis of modern access control systems. These systems are designed to personalize and regulate access to premises with restricted entry. They ensure that only authorized personnel or individuals with specific permissions can enter certain areas, thereby enhancing security and minimizing unauthorized access risks.</em></p> <p class="06AnnotationVKNUES"><em>The foundation of such systems lies in the use of a complex of interconnected equipment and software that together manage identification processes and access control mechanisms. These systems are broadly classified into two main types: autonomous and networked. Autonomous systems function independently, without requiring a central control unit, making them simpler and more reliable for smaller-scale implementations. In contrast, networked systems operate under centralized control, allowing for seamless integration with broader security infrastructure and providing enhanced monitoring and reporting capabilities.</em></p> <p class="06AnnotationVKNUES"><em>The article examines the specific features of each type of access control system, analyzing their advantages and disadvantages. Autonomous systems are typically easier to install and maintain but may lack the scalability and flexibility of networked solutions. Networked systems, on the other hand, offer better management options and integration with other security measures, yet they require more complex deployment and ongoing support.</em></p> <p class="06AnnotationVKNUES"><em>Additionally, the article reviews the leading global manufacturers supplying such systems, presenting an analysis of their key operational features, strengths, and weaknesses. Various architectures used in the development of access control software are also considered. These architectures generally fall into two categories: monolithic and microservice-based.</em></p> <p class="06AnnotationVKNUES"><em>Monolithic architectures, though simpler and more straightforward, can pose scalability challenges, whereas microservice-based architectures allow for greater flexibility and modularity. The article explores the algorithms underlying these architectures, their respective benefits and drawbacks, and provides a comparative analysis of their effectiveness in real-world applications.</em></p>Yuriy FORKUNDenys MAKARYSHKINVladyslav ANTONYUKVitalii LIUBCHYK
Copyright (c) 2025 Юрій ФОРКУН, Денис МАКАРИШКІН, Владислав АНТОНЮК, Віталій ЛЮБЧИК
https://creativecommons.org/licenses/by/4.0
2025-03-25 | 2025-03-25 | 1 | pp. 256–266 | DOI: 10.31891/2219-9365-2025-81-32
METHODOLOGICAL APPROACH TO COMPREHENSIVE IDENTIFICATION AND ANALYSIS OF CYBERTHREATS IN TRAFFIC IN 5G/IMT-2020 TELECOMMUNICATION NETWORKS BASED ON ARTIFICIAL INTELLIGENCE TECHNOLOGIES
https://vottp.khmnu.edu.ua/index.php/vottp/article/view/476
<p class="06AnnotationVKNUES"><em>The paper analyzes the process of identifying and analyzing cyberthreats of incoming traffic in 5G/IMT-2020 networks built using Ultra-reliable and low latency communications technology, identifies its features and research directions for increasing the efficiency and monitoring of traffic and analyzing cyberthreats. To solve the problem of identifying traffic and analyzing cyberthreats in the 5G/IMT-2020 network, the paper develops and presents an appropriate methodological approach. The specified methodological approach includes formation of metadata arrays of the incoming flow of useful data and cyberattack data, modification of them into a set of training data, formation of a training software and hardware complex and development of the neural network structure, carrying out the process of training the neural network and implementing it in the process of traffic identification and analysis of cyber threats in 5G/IMT-2020 telecommunication networks.</em></p> <p class="06AnnotationVKNUES"><em>Evaluation of the results of the training process of the proposed neural network and verification of its operation on test data sets in the trained state showed that the neural network presented in the work is able to monitor and identify traffic generated from Internet of Things services with a probability of up to 99.7%. In the process of monitoring and identifying traffic from two or more services, this probability may decrease, but is within the permissible limits of 80-90%.</em></p>Oleksandr TUROVSKYMykola RYZHAKOV
Copyright (c) 2025 Олександр ТУРОВСЬКИЙ, Микола РИЖАКОВ
https://creativecommons.org/licenses/by/4.0
2025-03-25 | 2025-03-25 | 1 | pp. 267–277 | DOI: 10.31891/2219-9365-2025-81-33