Proceedings Volume International Conference on Communication, Information, and Digital Technologies (CIDT2024), 1318501 (2024) https://doi.org/10.1117/12.3040033
This PDF file contains the front matter associated with SPIE Proceedings Volume 13185, including the Title Page, Copyright information, Table of Contents, and Conference Committee information.
Access to the requested content is limited to institutions that have purchased or subscribe to SPIE eBooks. You are receiving this notice because your organization may not have SPIE eBooks access.*
*Shibboleth/OpenAthens users: please sign in to access your institution's subscriptions.
To obtain this item, you may purchase the complete book in print or electronic format on SPIE.org.
Youjun Bu, Han Wang, Ruilong Chen, Qiao Zhang, Xianjun Hu
Proceedings Volume International Conference on Communication, Information, and Digital Technologies (CIDT2024), 1318502 (2024) https://doi.org/10.1117/12.3032737
With the rapid development of information technology, numerous vulnerabilities have introduced serious security risks into information systems, becoming gateways and pathways for the spread of malicious software. To uncover security threats within information systems and assess their severity, engineers periodically conduct penetration tests and implement security enhancements based on the results. To reduce manual involvement in penetration testing and lessen its reliance on specialized skills, this paper designs an automated security penetration testing system. The system establishes a penetration testing platform that combines manual and automated methods to provide an in-depth assessment of the security of the target system. Based on this platform, a penetration test of a mimic-structure information system is carried out. The experimental results show that the automated penetration testing system designed in this paper realizes automatic invocation and execution of tools and improves the efficiency and accuracy of the penetration testing process, giving it strong potential for broader adoption.
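The automatic tool invocation the abstract describes can be sketched as a simple orchestration pipeline. The tool functions, target address, and findings below are hypothetical stand-ins for illustration, not the paper's actual toolchain.

```python
# Minimal sketch of automated penetration-testing orchestration: each
# "tool" is a callable whose output feeds the next stage, and results
# are aggregated into one report. All names/data here are illustrative.

def port_scan(target):
    # Stand-in for a port scanner; returns simulated open ports.
    return {"open_ports": [22, 80]}

def service_probe(target, open_ports):
    # Stand-in for a vulnerability probe keyed on discovered ports.
    known_issues = {80: "outdated HTTP server"}
    return [known_issues[p] for p in open_ports if p in known_issues]

def run_pipeline(target):
    """Invoke each tool in order and aggregate findings into a report."""
    scan = port_scan(target)
    findings = service_probe(target, scan["open_ports"])
    return {"target": target, "findings": findings}

report = run_pipeline("192.0.2.10")
```

Chaining tools this way is what removes the manual hand-off between testing stages.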
Proceedings Volume International Conference on Communication, Information, and Digital Technologies (CIDT2024), 1318503 (2024) https://doi.org/10.1117/12.3032744
In recent years, the prevalence of laryngeal cancer has been rising gradually, and the best outcomes currently depend on early diagnosis and prevention. Early laryngeal cancer is difficult to detect, and imaging examination is one of the important methods for its diagnosis and prevention. However, owing to the particular nature of medical images, it is often difficult for the human eye to attend to all the characteristic information, and doctors who lack endoscopy experience are prone to misjudgment. Building a computer-aided diagnosis system for laryngeal diseases with deep learning is an effective solution, and semi-supervised learning has become a research hotspot in medical image classification. The intent of this study is to construct and evaluate a technique based on self-training and consistency regularization for classifying laryngeal disease images. Our dataset contains 4000 laryngeal images: 3000 normal images and 1000 laryngeal cancer images. We split the samples into training and validation sets at a ratio of 8:2; thirty percent of the training samples were labeled and seventy percent were unlabeled. Experimental data show that the classification accuracy of the model reaches 80.125%. The classifier is helpful for developing a computer-aided diagnosis system for laryngeal diseases and can effectively reduce annotation cost.
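The self-training idea underlying this abstract can be sketched in a few lines: a classifier trained on the labeled pool pseudo-labels unlabeled samples, adopting only the confident ones each round. The toy 1-D data, nearest-centroid classifier, and margin threshold below are illustrative assumptions, not the paper's model.

```python
# Minimal sketch of self-training (pseudo-labeling) on toy 1-D data.

def centroids(points, labels):
    # Mean position of each class in the labeled pool.
    by = {}
    for x, y in zip(points, labels):
        by.setdefault(y, []).append(x)
    return {y: sum(xs) / len(xs) for y, xs in by.items()}

def predict(c, x):
    # Return (label, margin): margin = gap between the two class distances.
    d = sorted((abs(x - m), y) for y, m in c.items())
    return d[0][1], d[1][0] - d[0][0]

labeled_x, labeled_y = [0.0, 1.0, 9.0, 10.0], [0, 0, 1, 1]
unlabeled = [0.5, 9.5, 5.0]

for _ in range(3):                 # a few self-training rounds
    c = centroids(labeled_x, labeled_y)
    still = []
    for x in unlabeled:
        y, margin = predict(c, x)
        if margin > 4.0:           # adopt only confident pseudo-labels
            labeled_x.append(x)
            labeled_y.append(y)
        else:
            still.append(x)
    unlabeled = still
```

The ambiguous midpoint (5.0) is never adopted, which is exactly how the confidence gate keeps label noise out of the training pool.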
Proceedings Volume International Conference on Communication, Information, and Digital Technologies (CIDT2024), 1318504 (2024) https://doi.org/10.1117/12.3032790
In response to the low efficiency and high latency of current enterprise financial report risk management evaluation, this article constructs an evaluation model based on machine learning. Corporate financial data were obtained from public financial statements, financial databases, and similar sources; after data preprocessing and feature extraction, information gain was used to select the financial statement indicators with the most significant impact on corporate financial risk. A Support Vector Machine (SVM) prediction model was then built and trained on historical financial data, with model parameters optimized by algorithms such as gradient descent; finally, the model's performance was evaluated. Even when the corporate dataset exceeds 1 TB, processing takes only 590.8 seconds, suggesting that the model provides accurate corporate financial risk assessments with short response latency.
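The information-gain screening of indicators described above reduces to an entropy difference: the gain of a feature is the label entropy minus the weighted entropy after splitting on that feature. The tiny labels and feature columns below are made up for illustration.

```python
# Minimal sketch of information-gain feature screening.
from math import log2

def entropy(labels):
    n = len(labels)
    return -sum((labels.count(c) / n) * log2(labels.count(c) / n)
                for c in set(labels))

def information_gain(feature, labels):
    """H(labels) minus the weighted entropy after splitting on feature."""
    n = len(labels)
    cond = 0.0
    for v in set(feature):
        subset = [y for x, y in zip(feature, labels) if x == v]
        cond += len(subset) / n * entropy(subset)
    return entropy(labels) - cond

# A perfectly predictive binary indicator has gain = H(labels) = 1 bit;
# an uninformative one has gain 0.
labels  = ["risk", "risk", "safe", "safe"]
perfect = [1, 1, 0, 0]
useless = [1, 0, 1, 0]
```

Ranking indicators by this score and keeping the top ones is the screening step the abstract describes.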
Proceedings Volume International Conference on Communication, Information, and Digital Technologies (CIDT2024), 1318505 (2024) https://doi.org/10.1117/12.3032801
Big data analysis platforms often focus only on certain specific aspects of high availability and fault tolerance, leaving their overall availability and fault tolerance relatively low. Taking power data as its object, this article builds a big data analysis platform on cloud computing and studies the optimization of platform availability and fault tolerance. First, experimental data such as equipment status data and power supply network data were collected from a power company. Then, to meet the platform's requirements for high availability and fault tolerance, multi-area cloud deployment and elastic resource management were adopted, with load balancing used to distribute requests evenly across multiple nodes and a fault-tolerant routing mechanism used to detect and bypass faulty nodes in a timely manner. Finally, a power big data analysis platform integrating cloud computing was constructed, and its high availability and fault tolerance were verified. The experimental results show that the optimized cloud-based platform takes only 0.68 seconds to recover from failures, 1.27 seconds less than traditional big data analysis platforms, and its data integrity reaches 98.60%, indicating that the experimental platform has high availability and fault tolerance.
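The load-balancing step described above can be sketched as least-loaded assignment: each incoming request goes to the node currently carrying the fewest requests. Node names and request counts are illustrative.

```python
# Minimal sketch of least-loaded request balancing across nodes.

def assign(requests, nodes):
    load = {n: 0 for n in nodes}
    placement = []
    for r in requests:
        target = min(load, key=load.get)   # least-loaded node wins
        load[target] += 1
        placement.append((r, target))
    return load, placement

load, placement = assign(range(6), ["node-a", "node-b", "node-c"])
```

With uniform requests this degenerates to round-robin, spreading six requests evenly over three nodes; a fault-tolerant router would additionally drop failed nodes from `load` before each assignment.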
Proceedings Volume International Conference on Communication, Information, and Digital Technologies (CIDT2024), 1318506 (2024) https://doi.org/10.1117/12.3033049
To address the insufficient representation of single-feature information in academic papers, this paper combines text features with the graph structure of an academic graph to capture the correlations and influence among papers more comprehensively, improving the accuracy and richness of feature representation. The method first uses the pre-trained ERNIE large model to obtain an initial representation of the paper text, then applies DPCNN and Bi-LSTM to extract deeper textual features. An academic graph is constructed, and a metapath-guided random walk with restart combined with Skip-gram extracts feature representations of the graph nodes. The text features and graph node features are fused by an attention mechanism to obtain the final representation. The effectiveness of the method is verified on a text classification task, with classification results predicted by a SoftMax layer. The experimental results show that the accuracy, recall, and F1 score of the proposed model are 86.2%, 88.5%, and 87.3%, respectively, outperforming single-semantic methods. This can help researchers obtain relevant information faster and more accurately, improve research efficiency, and provide effective support for automatic paper classification, recommendation systems, and text mining, giving the method clear application prospects and practical value.
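The attention-based fusion of text and graph-node features can be sketched as a softmax-weighted sum of the two modality vectors. The vectors and scores below are illustrative; in the paper's model the scores would be produced by learned parameters.

```python
# Minimal sketch of attention fusion of two feature vectors.
from math import exp

def softmax(scores):
    m = max(scores)                       # subtract max for stability
    e = [exp(s - m) for s in scores]
    z = sum(e)
    return [x / z for x in e]

def attention_fuse(text_vec, graph_vec, text_score, graph_score):
    """Weight each modality by a softmax over its (learned) score."""
    wt, wg = softmax([text_score, graph_score])
    return [wt * t + wg * g for t, g in zip(text_vec, graph_vec)]

# Equal scores -> equal weights -> the fused vector is the mean.
fused = attention_fuse([1.0, 0.0], [0.0, 1.0],
                       text_score=0.0, graph_score=0.0)
```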
Proceedings Volume International Conference on Communication, Information, and Digital Technologies (CIDT2024), 1318507 (2024) https://doi.org/10.1117/12.3033073
With the deep integration of big data into the automotive industry, the position of big data as a strategic resource has become increasingly prominent: the value of data as a key production factor keeps growing, and issues such as data circulation, data quality, and data security are attracting more and more attention. For automobile enterprises facing the rapid growth of massive, multi-source, heterogeneous automotive big data, how to promote the efficient use of data and drive digital transformation through data governance is a current research focus. This paper uses the "data 5W2H" analysis model to summarize and analyze how automobile enterprises can carry out systematic data governance, and proposes an automotive big data governance system built from four aspects: governance activities, organizational structure, institutional processes, and platform construction. Based on a comprehensive inventory of existing data assets, unified data standards and norms are adopted to improve data quality, ensure data security, and take the data from disorder to standardized order.
Proceedings Volume International Conference on Communication, Information, and Digital Technologies (CIDT2024), 1318508 (2024) https://doi.org/10.1117/12.3033311
Pathological classification of prostate ultrasound images often faces high noise interference, low signal-to-noise ratio, and low resolution. Taking puncture needle tract tissue in prostate ultrasound images as an example, a complete AI-ready dataset of prostate ultrasound needle tract tissue was constructed. Mask R-CNN is used to segment the needle tract tissue in prostate ultrasound images, and the resulting segmentation model can automatically identify the parts of prostate tissue that may be cancerous. The background information is then masked, and the needle tract tissue dataset is constructed for use with classical image classification network models. Compared with four mainstream contrastive learning methods, performance metrics such as Precision, Recall, and F1-score are all higher than those obtained by classifying the original images directly. The constructed needle tract dataset yields higher accuracy in each classical classification network, can effectively improve the efficiency of prostate cancer diagnosis, and promotes the development of ultrasound image diagnosis technology for prostate cancer.
Proceedings Volume International Conference on Communication, Information, and Digital Technologies (CIDT2024), 1318509 (2024) https://doi.org/10.1117/12.3033375
The data center uses virtualization and isolation technologies to provide flexible and efficient services for multiple tenants. One of the most challenging aspects of resource sharing is task scheduling: during scheduling, it is crucial to ensure fairness in user resource usage while achieving high cluster utilization and energy efficiency. However, the heterogeneity of resources and the variation in user demands make it extremely difficult to provide an effective scheduling solution. In this paper, we propose an efficient heuristic scheduling algorithm called SAUFEE, which trades off the resource requirements of multiple tenants against cluster power consumption. First, we introduce a user fairness model that prioritizes, in each scheduling round, the tasks of the users with the least resource allocation, ensuring fairness among them. Next, we propose a resource utilization model that schedules user tasks to reduce resource waste; additionally, idle machines are shut down to save overall cluster energy. Simulation results show that our algorithm increases the number of running tasks by 3.3% and CPU utilization by 3.4% while ensuring fairness, playing an important role in improving cluster energy efficiency and user fairness.
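The fairness rule above (serve the least-allocated user each round) can be sketched directly; the user names and task sizes below are illustrative, not SAUFEE's full model.

```python
# Minimal sketch of least-allocated-first fair scheduling.

def schedule_round(allocated, pending):
    """Pick the user with the smallest allocation who still has tasks."""
    candidates = [u for u in pending if pending[u]]
    user = min(candidates, key=lambda u: allocated[u])
    task = pending[user].pop(0)
    allocated[user] += task
    return user

allocated = {"u1": 0, "u2": 5}          # resources granted so far
pending = {"u1": [2, 2], "u2": [1]}     # queued task sizes per user
order = [schedule_round(allocated, pending) for _ in range(3)]
```

The under-served user u1 is served twice before u2 gets another turn, which is the fairness behavior the model formalizes.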
Proceedings Volume International Conference on Communication, Information, and Digital Technologies (CIDT2024), 131850A (2024) https://doi.org/10.1117/12.3033521
In an era marked by unprecedented challenges and opportunities, the intersection of artificial intelligence (AI) and sustainable leadership has become a focal point of innovation and change. This review delves into the dynamic realm where cutting-edge technology meets the timeless principles of responsible and impactful leadership. In this thought-provoking exploration, the article navigates the profound implications, strategies, and potential outcomes of harnessing AI to empower leaders in steering organizations toward a more sustainable, equitable, and prosperous future. The review unveils the synergy of AI and sustainable leadership, forging a path toward transformative progress in an ever-evolving world.
Proceedings Volume International Conference on Communication, Information, and Digital Technologies (CIDT2024), 131850B (2024) https://doi.org/10.1117/12.3033555
Facing the challenge of classifying English verb inflections, a technique combining the K-means clustering algorithm with a Support Vector Machine (SVM) is proposed to enhance classification efficiency and accuracy. This study establishes a semantic structure model of English verb inflection and accurately describes its semantic attributes, aiming to capture state features. The similarity between these state feature vectors is quantified with cosine similarity. Experimental results demonstrate a high level of accuracy in computing the similarity of state feature vectors, effectively classifying both regular and irregular verb inflections. Setting the positive slack factor and the error penalty parameter to 3000 and 9, respectively, further optimizes the classification results, indicating the method's application potential. To ensure scientific rigor, extensive citations from the relevant literature are provided, elucidating the differences from existing techniques and their advantages; strict adherence to APA Seventh Edition citation and formatting standards grounds the paper's academic contribution in a robust literature foundation.
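The cosine-similarity evaluation the abstract mentions is the standard normalized dot product between feature vectors; the vectors below are illustrative.

```python
# Minimal sketch of cosine similarity between state feature vectors.
from math import sqrt

def cosine_similarity(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = sqrt(sum(x * x for x in a))
    nb = sqrt(sum(y * y for y in b))
    return dot / (na * nb)   # 1.0 = identical direction, 0.0 = orthogonal
```

Inflection forms whose feature vectors score near 1 would fall into the same cluster; near-orthogonal vectors indicate dissimilar (e.g., irregular) patterns.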
Qiu Huang, Lan Gong, Shuqiong Li, Ping Huang, Jie Fan, Guotao Yang, Qing Wan
Proceedings Volume International Conference on Communication, Information, and Digital Technologies (CIDT2024), 131850C (2024) https://doi.org/10.1117/12.3033609
While meeting the performance requirements for portal radiation alarm systems in the national standard GB/T 24246-2009, this paper develops a portal radiation alarm system that applies artificial-intelligence-based computing to detect radioactive material. To detect low-activity radioactive material close to the natural background, the measurement principle underlying the system's detection sensitivity for low-activity material is discussed. A background stability test of the alarm system shows that the background count rate satisfies the three-standard-deviation criterion, demonstrating that the system's stability is reliable. On this basis, a detection experiment with low-activity radioactive material is carried out, and the system is able to detect low-activity material close to the natural background. The results indicate that it is feasible to use this alarm system to detect low-activity radioactive material, and that the diffusion of such material due to missed detections can be avoided.
Proceedings Volume International Conference on Communication, Information, and Digital Technologies (CIDT2024), 131850D (2024) https://doi.org/10.1117/12.3033617
The classification of skin lesion images is challenging due to the substantial intra-class variability and the mortality rate of skin cancer. Convolutional neural networks (CNNs) have recently been employed by specialists for dermatology-assisted diagnosis. We present a lightweight convolutional neural network model for skin lesion classification that improves classification precision. We employ MobileNet-V2 as the backbone network, whose blocks are built from convolutions and inverted residual blocks. To enhance the model's ability to represent pathological features, we add an improved reduced-interference coordinate attention block (RICA block) to the inverted residual block; the RICA block extracts features separately for lesioned and non-lesioned regions, limiting the impact of complex, irrelevant backgrounds on the model's classification performance. The experimental results demonstrate that our model, at just 3.32M parameters, achieves classification accuracies of 89.52% and 93.24% on the ISIC2018 and ISIC2019 test datasets, respectively, along with macro averages of 95%.
Communication Engineering and Next-Generation Networks
Proceedings Volume International Conference on Communication, Information, and Digital Technologies (CIDT2024), 131850E (2024) https://doi.org/10.1117/12.3032335
A satellite-terrestrial collaborative private network can make up for the poor signal coverage, limited bandwidth, low security, and high cost of existing wireless public networks, providing reliable and stable communications for power systems. However, such a network is resource-constrained, with limited on-board spectrum, power, computation, and storage resources, so allocating these limited channel resources reasonably and efficiently is especially important for improving resource utilization and system performance. We therefore establish a satellite-terrestrial collaborative network architecture for power application scenarios. To improve its resource allocation efficiency, we model the channel allocation process by having the network's satellite agents sense the channel allocation state and the beam users' service request state in the environment, and we propose a dynamic channel allocation algorithm based on deep learning. We design state, action, and reward functions for feedback-driven optimization of the channel allocation strategy, realizing optimal allocation of system channel resources: channel resources are allocated to users according to the current allocation policy, and environment-based reward information is used to update the policy toward the optimum. Simulation results show that the proposed channel resource allocation algorithm for the satellite-terrestrial collaborative network achieves a lower service blocking rate and higher channel utilization than other allocation algorithms.
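The service blocking rate the abstract reports can be illustrated with a minimal channel-allocation loop: a request takes any free channel and is blocked if none is free. The arrival and holding times are illustrative, and the first-free rule here is a baseline that the paper's learned policy would replace.

```python
# Minimal sketch of channel allocation and the blocking-rate metric.

def serve(requests, n_channels):
    """requests: list of (arrive, depart) times. Returns blocked count."""
    busy_until = [0] * n_channels        # per-channel release time
    blocked = 0
    for arrive, depart in sorted(requests):
        for i, t in enumerate(busy_until):
            if t <= arrive:              # channel i is free now
                busy_until[i] = depart
                break
        else:                            # no free channel: request blocked
            blocked += 1
    return blocked

requests = [(0, 4), (1, 3), (2, 5), (4, 6)]
blocked = serve(requests, n_channels=2)  # blocking rate = blocked / 4
```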
Proceedings Volume International Conference on Communication, Information, and Digital Technologies (CIDT2024), 131850F (2024) https://doi.org/10.1117/12.3032519
Aiming at the problems of uneven link load distribution and energy waste in power communication networks, this paper proposes a multi-objective genetic-algorithm-based routing optimization strategy for software-defined power communication networks. The standard deviation of the remaining bandwidth across the whole network and the number of newly activated links are used as quantitative indicators to realize dynamic allocation of power communication network resources. Simulation results show that the selected NSGA-II multi-objective genetic algorithm achieves a better optimization effect than the shortest-path Dijkstra algorithm, and that the strategy can meet the growing demand for service-carrying capacity while making the link load distribution more uniform.
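The two optimization objectives named above can be computed directly: the population standard deviation of remaining link bandwidth (load uniformity) and the count of links newly switched on (energy cost). The link sets and bandwidth figures below are illustrative.

```python
# Minimal sketch of the two NSGA-II objectives described above.
from statistics import pstdev

def objectives(remaining_bw, active_before, active_after):
    """remaining_bw: remaining bandwidth per active link after routing.
    Returns (bandwidth std-dev, number of newly activated links);
    NSGA-II would minimize both."""
    newly_activated = len(active_after - active_before)
    return pstdev(remaining_bw), newly_activated

# Perfectly uniform residual load, one link woken up to carry it.
std, new_links = objectives([10, 10, 10], {"l1"}, {"l1", "l2"})
```

A candidate routing that lowers the std-dev but activates many sleeping links trades one objective against the other, which is exactly the Pareto front NSGA-II explores.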
Proceedings Volume International Conference on Communication, Information, and Digital Technologies (CIDT2024), 131850G (2024) https://doi.org/10.1117/12.3032526
System logs are widely used in system management for reliability assurance and serve as crucial data sources for detecting system anomalies. Prior to anomaly detection, it is necessary to parse raw textual logs into structured logs that can be recognized by anomaly detection models. To address the inefficiency of existing log parsing algorithms, this study proposes a distributed system log parsing task allocation model and a heuristic-based log parsing algorithm. By combining the task allocation model with the parsing algorithm, the log parsing task is completed. Experimental results demonstrate the feasibility and effectiveness of the proposed model and algorithm, which can reduce parsing time and enhance parsing efficiency. Future research could explore log parsing methods based on deep learning and log parsing algorithms integrating natural language processing techniques to improve the accuracy and applicability of log parsing.
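A heuristic log parser of the kind described reduces raw lines to structured templates by masking variable tokens. The numeric-token rule and sample log lines below are simplified illustrations, not the paper's algorithm.

```python
# Minimal sketch of heuristic log parsing: tokens that look like
# numbers (including dotted IPs/versions) become a <*> placeholder,
# so lines sharing a template collapse together.
import re

def to_template(line):
    tokens = line.split()
    return " ".join("<*>" if re.fullmatch(r"\d+(\.\d+)*", t) else t
                    for t in tokens)

logs = ["connect from 10.0.0.1", "connect from 10.0.0.2", "disk full"]
templates = {to_template(l) for l in logs}
```

The two connect lines collapse into one template, which is the structured form anomaly-detection models consume; a distributed parser would shard `logs` across workers before this step.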
Proceedings Volume International Conference on Communication, Information, and Digital Technologies (CIDT2024), 131850H (2024) https://doi.org/10.1117/12.3032677
The internal networks of cloud data centers are vast in scale, rich in data value, and heavy in traffic, imposing stringent demands on network security. Manual security operations for such network resources are highly resource-intensive and error-prone. To address this, this paper proposes a reinforcement-learning-based autonomous defense method for the internal networks of cloud data centers. We construct a leader-follower game model of network attack and defense that captures the NP-hard interactive behavior between attacker and defender, and we design a reinforcement learning defense agent that iterates its learning autonomously, continuously applies feedback control, and adapts to environmental changes, training it to solve for autonomous network defense strategies. Experiments demonstrate that our reinforcement learning agent acquires an excellent autonomous network defense capability, effectively resisting network attacks.
Lei Liang, Fuzhong Hao, Lei Qiao, Weijian Zhang, Qi Wang, Linhai Mu
Proceedings Volume International Conference on Communication, Information, and Digital Technologies (CIDT2024), 131850I (2024) https://doi.org/10.1117/12.3032730
Aiming at the poor link stability and routing detours caused by the static, minimum-hop route selection criterion of AODV in high-speed mobile IoT, an improved AODV routing algorithm with polymorphic perception (AODV-PP) is proposed. The protocol applies cross-layer design to route selection, combining energy consumption, node stability, and buffer congestion to compute link cost and select routes. Dynamic route updates are implemented using source node sequence numbers and an IP replacement cache mechanism. To reduce redundant links, a sector-area-based forwarding strategy is proposed that uses the location information of the source and destination nodes. Simulation comparisons show that the protocol can effectively reduce energy consumption, shorten network forwarding delay, and improve routing stability in high-speed mobile ad hoc networks, making it better suited to high-speed mobile Internet of Things environments.
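The cross-layer link cost combining energy, stability, and congestion can be sketched as a weighted sum over normalized inputs. The weights and normalization below are illustrative assumptions, not AODV-PP's tuned values.

```python
# Minimal sketch of a cross-layer link-cost metric: higher residual
# energy and stability lower the cost, higher congestion raises it.

def link_cost(energy, stability, congestion,
              w_e=0.4, w_s=0.4, w_c=0.2):
    """All inputs normalized to [0, 1]; lower cost = better next hop."""
    return w_e * (1 - energy) + w_s * (1 - stability) + w_c * congestion

good = link_cost(energy=0.9, stability=0.9, congestion=0.1)
bad  = link_cost(energy=0.2, stability=0.3, congestion=0.8)
```

Replacing AODV's hop count with a metric like this is what lets route selection avoid energy-depleted or congested next hops.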
Liang Han, Lin Zhang, Xiao Zhang, Shici Li, Haitao Xu
Proceedings Volume International Conference on Communication, Information, and Digital Technologies (CIDT2024), 131850J (2024) https://doi.org/10.1117/12.3032754
In recent years, satellite communications have developed rapidly, providing new options for power grid services where ground networks are unreachable or untrustworthy. Considering the intermittent, periodic nature of power grid communication, its high reliability requirements, its short individual services, and its large total data volume, and in order to ensure efficient and reliable transmission of important grid information, a load-monitoring-of-distribution-terminal-based handover algorithm of the power grid for satellite communication (LMSPG) is proposed. The algorithm takes the load status of the distribution terminal as one of the handover factors of the power communication network and solves for the optimal handover trigger condition of grid satellite communication with an improved particle swarm optimization algorithm, maximizing overall system utility. Simulation verifies that the proposed algorithm can improve the stability and reliability of power grid communication service transmission.
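Particle swarm optimization, which the algorithm above improves upon, can be sketched on a toy 1-D objective: each particle is pulled toward its own best position and the swarm's best. The coefficients and objective below are illustrative, not the paper's improved variant or its real handover objective.

```python
# Minimal sketch of particle swarm optimization on f(x) = (x - 3)^2.
import random

def pso(f, lo, hi, n_particles=20, iters=60, seed=1):
    rng = random.Random(seed)
    xs = [rng.uniform(lo, hi) for _ in range(n_particles)]
    vs = [0.0] * n_particles
    pbest = xs[:]                          # per-particle best position
    gbest = min(xs, key=f)                 # swarm-wide best position
    for _ in range(iters):
        for i in range(n_particles):
            r1, r2 = rng.random(), rng.random()
            # inertia + pull toward personal best + pull toward global best
            vs[i] = (0.5 * vs[i]
                     + 1.5 * r1 * (pbest[i] - xs[i])
                     + 1.5 * r2 * (gbest - xs[i]))
            xs[i] += vs[i]
            if f(xs[i]) < f(pbest[i]):
                pbest[i] = xs[i]
            if f(xs[i]) < f(gbest):
                gbest = xs[i]
    return gbest

best = pso(lambda x: (x - 3) ** 2, lo=-10, hi=10)
```

In the paper, the position would encode the handover trigger threshold and `f` would score stability/reliability of the resulting handovers.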
Proceedings Volume International Conference on Communication, Information, and Digital Technologies (CIDT2024)
, 131850K (2024) https://doi.org/10.1117/12.3032800
In this study, the transmission characteristics of 4–6 GHz incident microwaves in weakly ionized dusty plasma, generated by a coaxial hollow-cathode glow discharge under helium discharge conditions, are investigated based on theoretical models of conductivity and relative dielectric constant. The effects of different dust particle densities on the conductivity and relative dielectric constant of the dusty plasma are calculated. The relevant plasma parameters are then substituted into the formulas for the microwave attenuation coefficient and phase coefficient, and the experimental results are corrected and analyzed for errors. The results demonstrate good agreement between the model and publicly reported experimental measurements in a specific laboratory dusty plasma environment, confirming the reliability of the theory. This study provides theoretical guidance for exploring the electromagnetic properties of laboratory dusty plasma and contributes to improved microwave attenuation measurement methods for dusty plasma.
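The attenuation and phase coefficients mentioned above are conventionally obtained from the complex relative permittivity of a collisional plasma. The sketch below uses the standard Drude (electron-only) form and ignores the dust-specific corrections studied in the paper; the electron density and collision frequency in the example are illustrative values, not measurements from the work.

```python
import cmath
import math

E_CHARGE = 1.602176634e-19   # electron charge, C
E_MASS   = 9.1093837015e-31  # electron mass, kg
EPS0     = 8.8541878128e-12  # vacuum permittivity, F/m
C_LIGHT  = 2.99792458e8      # speed of light, m/s


def plasma_wave_coeffs(f_hz, n_e, nu):
    """Attenuation (Np/m) and phase (rad/m) coefficients of a microwave in a
    collisional plasma, from the Drude relative permittivity
    eps_r = 1 - omega_p^2 / (omega * (omega + i*nu))."""
    omega = 2 * math.pi * f_hz
    omega_p2 = n_e * E_CHARGE**2 / (EPS0 * E_MASS)   # plasma frequency squared
    eps_r = 1 - omega_p2 / (omega * (omega + 1j * nu))
    n = cmath.sqrt(eps_r)                            # complex refractive index
    alpha = omega / C_LIGHT * n.imag                 # attenuation coefficient
    beta = omega / C_LIGHT * n.real                  # phase coefficient
    return alpha, beta


# Illustrative parameters: 5 GHz wave, n_e = 1e17 m^-3, nu = 1 GHz
alpha, beta = plasma_wave_coeffs(5e9, 1e17, 1e9)
```

Dust grains modify this picture by charging up and adding extra collision and charging-current terms to the conductivity, which is the effect the paper quantifies.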
Proceedings Volume International Conference on Communication, Information, and Digital Technologies (CIDT2024)
, 131850L (2024) https://doi.org/10.1117/12.3032854
Because complex-network modeling of traditional command information systems treats links as a single type, it can neither reflect the heterogeneity of network links nor qualitatively and quantitatively analyze the influence of electronic interference on the connectivity of the command information system. A new modeling method for command information systems that incorporates the military communication network is therefore proposed. The communication network is divided into a wired communication subnet and a wireless communication subnet to address link heterogeneity, and the influence of electronic interference on network connectivity can be analyzed both qualitatively and quantitatively. The connectivity of the command information system network under random interference on edges, random interference on nodes, and degree-based node attacks is simulated and analyzed. The results show that this new modeling approach can qualitatively and quantitatively analyze the connectivity of a command information system under soft-weapon attack and is better suited to operational practice.
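The connectivity analysis described above (random edge removal, random node removal, and degree-based node attack) is commonly measured by the size of the giant connected component after the attack. A minimal sketch of that measurement on an undirected graph is shown below; the graph construction and attack size are illustrative assumptions, not the paper's network model.

```python
def giant_component_fraction(n, edges):
    """Fraction of the n nodes lying in the largest connected component."""
    adj = {i: set() for i in range(n)}
    for u, v in edges:
        adj[u].add(v)
        adj[v].add(u)
    seen, best = set(), 0
    for s in range(n):
        if s in seen:
            continue
        # depth-first traversal of the component containing s
        stack, size = [s], 0
        seen.add(s)
        while stack:
            u = stack.pop()
            size += 1
            for w in adj[u]:
                if w not in seen:
                    seen.add(w)
                    stack.append(w)
        best = max(best, size)
    return best / n


def attack_degree(n, edges, k):
    """Connectivity after removing the k highest-degree nodes (hub attack)."""
    deg = {i: 0 for i in range(n)}
    for u, v in edges:
        deg[u] += 1
        deg[v] += 1
    removed = set(sorted(deg, key=deg.get, reverse=True)[:k])
    kept = [(u, v) for u, v in edges if u not in removed and v not in removed]
    return giant_component_fraction(n, kept)
```

On a hub-and-spoke topology, removing a single high-degree node collapses the giant component, which is why degree-based attacks are the harshest of the three scenarios the paper simulates.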
Proceedings Volume International Conference on Communication, Information, and Digital Technologies (CIDT2024)
, 131850M (2024) https://doi.org/10.1117/12.3033284
With the rapid advancement of artificial intelligence, quantum technology, and evolved networks such as 5G-A and 6G, the security of mobile communication networks has become increasingly important. Cryptography, as the fundamental element of network security, plays a crucial role in building the security capacity of mobile communication networks. However, the existing 5G cryptographic technology system can no longer fully meet ever-changing security requirements. It is therefore necessary to accelerate research on enhanced cryptographic applications in 5G and evolved networks, cultivate new productive capability from the integration of cryptography and mobile communication networks, and enhance the security of 5G and evolved networks. This article focuses on cryptographic security enhancement technology for 5G and evolving networks, analyzing the feasibility of integrating such technology and exploring end-to-end cryptographic security enhancement methods spanning terminal devices, wireless transmission systems, core network infrastructure, and enterprise platforms. It provides a reference for the research and application of cryptographic security enhancement technology in subsequent networks.
Hongxi Zhou, Liang Han, Lin Zhang, Tong Lin, Han Zhang
Proceedings Volume International Conference on Communication, Information, and Digital Technologies (CIDT2024)
, 131850N (2024) https://doi.org/10.1117/12.3033299
In this paper, a scenario of single-satellite-assisted emergency rescue networking is proposed for the power grid, in which the satellite moves to a position over the surviving nodes and broadcasts information to the nodes in its coverage area through the downlink. A surviving node collects and decodes the received signals and then uploads its data to the satellite through the uplink. A dynamic resource allocation algorithm is proposed by applying the Self-correcting Deep Q Network (SCDQN) algorithm within a Dueling Deep Q Network (Dueling DQN) structure. The self-correcting DQN is chosen as the base algorithm because it can correct its own value estimates. Both the estimated-value network and the target-value network adopt the Dueling DQN structure, which replaces the direct evaluation of action values in the original DQN with separate evaluations of the state value and the action advantage. The improved algorithm achieves efficient and stable convergence. The validity of the algorithm is verified through simulation experiments, and its superiority is verified through comparison.
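The Dueling DQN structure mentioned above replaces a single action-value estimate with separate state-value and advantage streams. The mean-subtracted aggregation at the heart of that structure can be sketched as follows; this is the standard dueling formula only, not the paper's full SCDQN training loop or network.

```python
def dueling_q(value: float, advantages: list) -> list:
    """Dueling aggregation: Q(s, a) = V(s) + A(s, a) - mean_a' A(s, a').

    Subtracting the mean advantage keeps the value and advantage streams
    identifiable, since adding a constant to V and subtracting it from every
    A would otherwise leave Q unchanged.
    """
    mean_adv = sum(advantages) / len(advantages)
    return [value + a - mean_adv for a in advantages]


# State worth 1.0; three actions with raw advantages 0, 2, 4.
q_values = dueling_q(1.0, [0.0, 2.0, 4.0])  # → [-1.0, 1.0, 3.0]
```

In the paper's architecture this aggregation sits at the output of both the estimated-value and target-value networks, with the self-correcting mechanism applied to the target computation.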
Proceedings Volume International Conference on Communication, Information, and Digital Technologies (CIDT2024)
, 131850O (2024) https://doi.org/10.1117/12.3033519
Campus security supervision and information security early warning are crucial aspects propelling the advancement of smart campuses, exemplifying the integration of Internet and AI technologies in intelligent campus living. This study begins by addressing the challenge of network communication constraints in campus security system design. It examines the limitations of traditional technology-centric approaches and proposes a security system design strategy centered on Long Range Wide Area Network (LoRaWAN) technology, including the overall architectural blueprint of an intelligent security oversight system and the establishment of intelligent campus security management and operation centered on garbage information identification in campus cultural development. Furthermore, feature algorithms are integrated into the system's software framework to enhance the effectiveness of information security construction, strengthening the system's operation under LoRaWAN technology. The findings show that the LoRaWAN-based system achieves a recognition accuracy exceeding 98% in garbage information identification, with an average recognition accuracy of 92.34% across various rotation angles. Data flow updates adhere to normative refresh intervals, with an average response time of 1.65 seconds across the various commands. The wireless communication paradigm supported by LoRaWAN effectively facilitates campus security oversight and data handling, providing innovative technical avenues and conceptual frameworks for campus security design.