Journal Articles
6 articles found
1. Computing Power Network: A Survey (cited 2 times)
Authors: Sun Yukun, Lei Bo, Liu Junlin, Huang Haonan, Zhang Xing, Peng Jing, Wang Wenbo. China Communications (SCIE, CSCD), 2024, No. 9, pp. 109-145 (37 pages).
With the rapid development of cloud computing, edge computing, and smart devices, computing power resources show a trend toward ubiquitous deployment. The traditional network architecture cannot efficiently leverage these distributed computing power resources because of the computing power island effect. To overcome these problems and improve network efficiency, a new network computing paradigm, the Computing Power Network (CPN), has been proposed. A computing power network can connect ubiquitous and heterogeneous computing power resources through networking to realize flexible computing power scheduling. In this survey, we make an exhaustive review of state-of-the-art research efforts on computing power networks. We first give an overview of the computing power network, including its definition, architecture, and advantages. Next, a comprehensive elaboration of issues concerning computing power modeling, information awareness and announcement, resource allocation, network forwarding, the computing power transaction platform, and the resource orchestration platform is presented. A computing power network testbed is built and evaluated. Applications and use cases of computing power networks are discussed. Then, the key enabling technologies for computing power networks are introduced. Finally, open challenges and future research directions are presented.
Keywords: computing power modeling; computing power network; computing power scheduling; information awareness; network forwarding
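Illustration (not from the paper): a minimal sketch of the kind of computing power scheduling the survey covers, assuming each node announces its available capacity and network delay and a requester greedily picks the node with the lowest estimated completion time. All node, task, and field names are hypothetical.

```python
# Minimal sketch (not from the paper): greedy computing-power scheduling.
# Each node announces its available capacity and network delay; tasks are
# assigned to the node minimizing estimated completion time. Illustrative only.

from dataclasses import dataclass

@dataclass
class Node:
    name: str
    capacity: float   # advertised computing power, GFLOPS (assumed unit)
    delay_ms: float   # network delay to the requester

@dataclass
class Task:
    name: str
    workload: float   # required compute, GFLOP (assumed unit)

def schedule(tasks, nodes):
    """Assign each task to the node minimizing transfer delay + compute time."""
    plan = {}
    for task in sorted(tasks, key=lambda t: -t.workload):
        best = min(nodes, key=lambda n: n.delay_ms / 1000 + task.workload / n.capacity)
        plan[task.name] = best.name
    return plan

if __name__ == "__main__":
    nodes = [Node("edge-1", 50.0, 5.0), Node("cloud-1", 500.0, 40.0)]
    tasks = [Task("inference", 20.0), Task("training", 900.0)]
    print(schedule(tasks, nodes))  # small job -> edge, large job -> cloud
```

In this toy setting the small inference job lands on the nearby edge node while the large training job goes to the remote cloud, which is the basic intent of cross-domain computing power scheduling.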
2. Improving throughput in SWIPT-based wireless multirelay networks with relay selection and rateless codes
Authors: Gaofei Huang, Qihong Zhong, Hui Zheng, Sai Zhao, Dong Tang. Digital Communications and Networks (SCIE, CSCD), 2024, No. 4, pp. 1131-1144 (14 pages).
This paper studies a dual-hop Simultaneous Wireless Information and Power Transfer (SWIPT)-based multi-relay network with a direct link. To achieve high throughput in the network, a novel protocol is first developed, in which the network can switch between a direct transmission mode and a Single-Relay-Selection-based Cooperative Transmission (SRS-CT) mode that employs dynamic decode-and-forward relaying realized with Rateless Codes (RCs). Then, under this protocol, an optimization problem is formulated to jointly optimize the network operation mode and the resource allocation in the SRS-CT mode. The formulated problem is difficult to solve because not only does the noncausal Channel State Information (CSI) make the problem stochastic, but the energy state evolution at each relay is also complicated by the network operation mode decision and the resource allocation. Assuming that noncausal CSI is available, the stochastic optimization problem is first addressed by solving the corresponding deterministic optimization problem via dynamic programming, where the complicated energy-state evolution is handled by a layered optimization method. Then, based on a finite-state Markov channel model and assuming that the statistical properties of the CSI are known, the stochastic optimization problem is solved by extending the result derived for the noncausal-CSI case to the causal-CSI case. Finally, a myopic strategy is proposed to achieve a tradeoff between complexity and performance without knowledge of the CSI statistics. Simulation results verify that the proposed SRS-and-RC-based design can achieve a throughput gain of up to approximately 40% over a simple SRS-and-RC-based baseline scheme in SWIPT-based multi-relay networks.
Keywords: simultaneous wireless information and power transfer; energy harvesting; relay networks; throughput maximization
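Illustration (an assumption, not the paper's optimization framework): a minimal sketch of mode selection between direct transmission and single-relay-selection cooperative transmission with half-duplex decode-and-forward relays, using Shannon rates. The paper's SWIPT energy constraints, rateless-code mechanics, and dynamic-programming solution are omitted.

```python
# Minimal sketch: direct vs. best-relay mode selection with decode-and-forward.
# The two-hop DF rate is limited by the weaker hop and halved for the two
# half-duplex phases. SNR values below are illustrative assumptions.

import math

def rate(snr_linear: float) -> float:
    """Achievable rate in bit/s/Hz for a given linear SNR."""
    return math.log2(1.0 + snr_linear)

def select_mode(snr_direct, relay_snrs):
    """relay_snrs: list of (source->relay SNR, relay->destination SNR)."""
    direct_rate = rate(snr_direct)
    best_relay, best_rate = None, 0.0
    for idx, (snr_sr, snr_rd) in enumerate(relay_snrs):
        df_rate = 0.5 * rate(min(snr_sr, snr_rd))  # half-duplex DF bottleneck
        if df_rate > best_rate:
            best_relay, best_rate = idx, df_rate
    if best_rate > direct_rate:
        return ("relay", best_relay, best_rate)
    return ("direct", None, direct_rate)

if __name__ == "__main__":
    print(select_mode(snr_direct=2.0, relay_snrs=[(8.0, 6.0), (20.0, 3.0)]))
```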
3. Modified Computational Ranking Model for Cloud Trust Factor Using Fuzzy Logic
Authors: Lei Shen, Ting Huang, Nishui Cai, Hao Wu. Intelligent Automation & Soft Computing (SCIE), 2023, No. 7, pp. 507-524 (18 pages).
Through the internet and cloud computing, users can access their data as well as the programs they have installed. Choosing which cloud service provider to use is now more challenging than ever. When it comes to the dependability of cloud infrastructure services, cloud providers and cloud consumers share an equal responsibility to exercise utmost care, so additional caution is required to reach appropriate values given the ever-increasing need for correct decision-making. The purpose of this study is to provide an updated computational ranking approach for multi-criteria decision-making using fuzzy logic in a public cloud scenario. This improved computational ranking system is an improvised VlseKriterijumska Optimizacija I Kompromisno Resenje (VIKOR) method. It gives users access to a trustworthy selection of cloud services that fit their needs. The proposed technique is broken down into nine discrete steps. To verify these steps, a numerical example is evaluated for each of six different scenarios, and the outcomes are simulated.
Keywords: cloud; trust; computational ranking; VIKOR; fuzzy
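Illustration: a minimal crisp-valued VIKOR ranking sketch. The paper applies an improvised fuzzy VIKOR over nine steps; here the criteria, weights, and provider names are illustrative assumptions and fuzzification is omitted.

```python
# Minimal VIKOR sketch with crisp scores (the paper uses a fuzzy variant).
# Alternatives are ranked by the compromise index Q (lower is better).

def vikor(scores, weights, v=0.5):
    """scores[i][j]: benefit score of alternative i on criterion j (higher is better)."""
    n_crit = len(weights)
    best = [max(row[j] for row in scores) for j in range(n_crit)]
    worst = [min(row[j] for row in scores) for j in range(n_crit)]
    S, R = [], []
    for row in scores:
        d = [weights[j] * (best[j] - row[j]) / ((best[j] - worst[j]) or 1.0)
             for j in range(n_crit)]
        S.append(sum(d))   # group utility
        R.append(max(d))   # individual regret
    s_star, s_minus = min(S), max(S)
    r_star, r_minus = min(R), max(R)
    Q = [v * (S[i] - s_star) / ((s_minus - s_star) or 1.0)
         + (1 - v) * (R[i] - r_star) / ((r_minus - r_star) or 1.0)
         for i in range(len(scores))]
    return sorted(range(len(scores)), key=lambda i: Q[i])  # lowest Q ranks first

if __name__ == "__main__":
    providers = ["CSP-A", "CSP-B", "CSP-C"]          # hypothetical providers
    # criteria: availability, security, cost-efficiency (all benefit-type)
    scores = [[0.9, 0.7, 0.6], [0.8, 0.9, 0.7], [0.7, 0.6, 0.9]]
    ranking = vikor(scores, weights=[0.4, 0.35, 0.25])
    print([providers[i] for i in ranking])
```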
4. Communication efficiency optimization of federated learning for computing and network convergence of 6G networks
Authors: Yizhuo CAI, Bo LEI, Qianying ZHAO, Jing PENG, Min WEI, Yushun ZHANG, Xing ZHANG. Frontiers of Information Technology & Electronic Engineering (SCIE, EI, CSCD), 2024, No. 5, pp. 713-727 (15 pages).
Federated learning effectively addresses issues such as data privacy by collaborating across participating devices to train global models. However, factors such as network topology and the computing power of devices can affect its training or communication process in complex network environments. Computing and network convergence (CNC) of sixth-generation (6G) networks, a new network architecture and paradigm with computing-measurable, perceptible, distributable, dispatchable, and manageable capabilities, can effectively support federated learning training and improve its communication efficiency. CNC can reach this goal by guiding the training of participating devices based on business requirements, resource load, network conditions, and the computing power of devices. In this paper, to improve the communication efficiency of federated learning in complex networks, we study communication efficiency optimization methods of federated learning for CNC of 6G networks that make decisions on the training process for different network conditions and different computing power of participating devices. The simulations address the two architectures that exist for devices in federated learning and arrange devices to participate in training based on their computing power, while optimizing communication efficiency during the transfer of model parameters. The results show that the proposed methods cope well with complex network situations, effectively balance the delay distribution of participating devices for local training, improve communication efficiency during the transfer of model parameters, and improve resource utilization in the network.
Keywords: computing and network convergence; communication efficiency; federated learning; two architectures
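Illustration (an assumption, not the paper's method): a minimal sketch of delay-aware device selection for a federated learning round, where devices that cannot finish local training plus model upload within a deadline are excluded so that slow devices do not stall aggregation. Device parameters and helper names are hypothetical.

```python
# Minimal sketch: keep only devices whose per-round delay (local training +
# model upload) meets a deadline; the round time is set by the slowest
# selected device. Numbers below are illustrative assumptions.

from dataclasses import dataclass

@dataclass
class Device:
    name: str
    flops: float        # available computing power, GFLOPS
    uplink_mbps: float  # available uplink bandwidth

def round_delay(dev: Device, workload_gflop: float, model_mb: float) -> float:
    return workload_gflop / dev.flops + model_mb * 8 / dev.uplink_mbps

def select_devices(devices, workload_gflop, model_mb, deadline_s):
    chosen = [d for d in devices
              if round_delay(d, workload_gflop, model_mb) <= deadline_s]
    round_time = max((round_delay(d, workload_gflop, model_mb) for d in chosen),
                     default=0.0)
    return chosen, round_time

if __name__ == "__main__":
    fleet = [Device("phone", 10, 5), Device("edge", 80, 50), Device("iot", 2, 1)]
    chosen, t = select_devices(fleet, workload_gflop=40, model_mb=10, deadline_s=10)
    print([d.name for d in chosen], round(t, 2))
```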
5. Adaptive multi-layer deployment for a digital-twin-empowered satellite-terrestrial integrated network
Authors: Yihong TAO, Bo LEI, Haoyang SHI, Jingkai CHEN, Xing ZHANG. Frontiers of Information Technology & Electronic Engineering, 2025, No. 2, pp. 246-259 (14 pages).
With the development of satellite communication technology, satellite-terrestrial integrated networks (STINs), which integrate satellite networks and ground networks, can realize seamless global coverage of communication services. Confronting the intricacies of network dynamics, resource heterogeneity, and the unpredictability of user mobility, dynamic resource allocation within such networks faces formidable challenges. Digital twin (DT), as a new technique, can mirror a physical network into a virtual network to monitor, analyze, and optimize the physical network. Nevertheless, when constructing a DT model, the deployment location and resource allocation of DTs may adversely affect DT performance. Therefore, we propose a STIN model that alleviates the insufficient single-layer deployment flexibility of traditional edge networks by deploying DTs at multi-layer nodes in a STIN. To address the challenge of deploying DTs in the network, we formulate a multi-layer DT deployment problem in the STIN to reduce system delay. We then adopt a multi-agent reinforcement learning (MARL) scheme to explore the optimal strategy for the multi-layer DT deployment problem. The implemented scheme demonstrates a notable reduction in system delay, as evidenced by simulation results.
Keywords: digital twin; satellite-terrestrial integrated network; deployment; multi-agent reinforcement learning
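Illustration (not the paper's MARL design): a minimal sketch in which independent tabular Q-learning agents, one per digital twin, learn which layer (satellite, edge, or ground) to deploy to so as to minimize delay. The per-layer delays, capacities, and congestion penalty are assumed values.

```python
# Minimal multi-agent sketch: each DT agent runs stateless Q-learning over
# three candidate layers; rewards are negative delays with a congestion
# penalty when a layer's assumed slot capacity is exceeded.

import random

LAYERS = ["satellite", "edge", "ground"]
BASE_DELAY = {"satellite": 30.0, "edge": 10.0, "ground": 5.0}   # ms, assumed
CAPACITY = {"satellite": 3, "edge": 2, "ground": 1}             # DT slots, assumed

def step(actions):
    """Return per-agent rewards (negative delay, with congestion penalty)."""
    rewards = []
    for a in actions:
        load = sum(1 for other in actions if other == a)
        congestion = max(0, load - CAPACITY[a]) * 20.0
        rewards.append(-(BASE_DELAY[a] + congestion + random.gauss(0, 1)))
    return rewards

def train(n_agents=4, episodes=3000, eps=0.1, lr=0.1):
    q = [{a: 0.0 for a in LAYERS} for _ in range(n_agents)]
    for _ in range(episodes):
        actions = [random.choice(LAYERS) if random.random() < eps
                   else max(q[i], key=q[i].get) for i in range(n_agents)]
        for i, r in enumerate(step(actions)):
            q[i][actions[i]] += lr * (r - q[i][actions[i]])
    return [max(q[i], key=q[i].get) for i in range(n_agents)]

if __name__ == "__main__":
    print(train())  # agents tend to spread across layers to avoid congestion
```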
6. Semantic separator learning and its applications in unsupervised Chinese text parsing
Authors: Yuming WU, Xiaodong LUO, Zhen YANG. Frontiers of Computer Science (SCIE, EI, CSCD), 2013, No. 1, pp. 55-68 (14 pages).
Grammar learning has long been a bottleneck problem. In this paper, we propose a method of semantic separator learning, a special case of grammar learning. The method is based on the hypothesis that some classes of words, called semantic separators, split a sentence into several constituents. The semantic separators are represented by words together with their part-of-speech tags and other information so that rich semantic information can be involved. In the method, we first identify the semantic separators with the help of noun phrase boundaries, called subseparators. Next, the argument classes of the separators are learned from a corpus by generalizing argument instances in a hypernym space. Finally, to evaluate the learned semantic separators, we use them in unsupervised Chinese text parsing. Experiments on a manually labeled test set show that the proposed method outperforms previous methods of unsupervised text parsing.
Keywords: semantic separator; separator learning; unsupervised text parsing
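Illustration: a minimal sketch of the parsing step, splitting a word-segmented Chinese sentence into constituents at separator words. The separator lexicon here is a toy assumption; the paper learns separators and their argument classes from a corpus.

```python
# Minimal sketch: split a segmented Chinese sentence into constituents at
# separator words (e.g., prepositions and conjunctions). Toy lexicon only.

SEPARATORS = {"在", "和", "因为", "所以", "但是", "对于"}  # assumed separators

def split_at_separators(tokens):
    """Return constituents; each separator starts a new constituent."""
    constituents, current = [], []
    for tok in tokens:
        if tok in SEPARATORS and current:
            constituents.append(current)
            current = []
        current.append(tok)
    if current:
        constituents.append(current)
    return constituents

if __name__ == "__main__":
    sentence = ["他", "在", "图书馆", "看书", "因为", "明天", "有", "考试"]
    print(split_at_separators(sentence))
    # [['他'], ['在', '图书馆', '看书'], ['因为', '明天', '有', '考试']]
```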