In 2023, pivotal advancements in artificial intelligence (AI) have emerged. Against this backdrop, traditional methodologies, notably the p-y approach, have struggled to accurately model the complex, nonlinear soil-structure interactions of laterally loaded large-diameter drilled shafts. This study undertakes a rigorous evaluation of machine learning (ML) and deep learning (DL) techniques, offering a comprehensive review of their application to this geotechnical challenge. A thorough review and comparative analysis were carried out to investigate various AI models such as artificial neural networks (ANNs), relevance vector machines (RVMs), and least squares support vector machines (LSSVMs). It was found that although ML approaches outperform classic methods in predicting the lateral behavior of piles, their 'black box' nature and purely data-driven character mean that their results demonstrate statistical robustness rather than clear geotechnical insight, a fact underscored by the mathematical equations derived from these studies. Furthermore, the research identified a gap in the availability of drilled shaft datasets, limiting the extendibility of current findings to large-diameter piles. An extensive dataset, compiled from a series of lateral loading tests on free-head drilled shafts with varying properties and geometries, is introduced to bridge this gap. The paper concludes with a direction for future research, proposing the integration of physics-informed neural networks (PINNs), which combine data-driven models with fundamental geotechnical principles to improve both the interpretability and predictive accuracy of AI applications in geotechnical engineering, marking a novel contribution to the field.
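The PINN direction proposed above can be illustrated with a composite loss that combines data misfit with a physics residual. The sketch below is purely hypothetical: the paper does not specify a formulation, so a simplified beam-on-springs relation (EI·y'''' + p(y) = 0, with y'''' approximated by finite differences along depth) stands in for the governing geotechnical equation, and all names and values are illustrative.

```python
import numpy as np

def pinn_loss(y_pred, y_obs, z, EI=1.0e6, p=lambda y: 1.0e4 * y, w_phys=1.0):
    """Composite PINN-style loss: data MSE plus mean squared physics residual."""
    data_loss = np.mean((y_pred - y_obs) ** 2)
    h = z[1] - z[0]
    # Fourth derivative by central finite differences (interior points only)
    y4 = (y_pred[:-4] - 4 * y_pred[1:-3] + 6 * y_pred[2:-2]
          - 4 * y_pred[3:-1] + y_pred[4:]) / h ** 4
    # Residual of the simplified governing equation EI * y'''' + p(y) = 0
    residual = EI * y4 + p(y_pred[2:-2])
    phys_loss = np.mean(residual ** 2)
    return data_loss + w_phys * phys_loss

z = np.linspace(0.0, 10.0, 51)            # depth coordinate along the shaft
y_obs = 0.01 * np.exp(-0.5 * z)           # synthetic "measured" deflections
loss = pinn_loss(y_obs.copy(), y_obs, z)  # data term vanishes; physics term remains
```

Minimizing such a loss over a network's predictions is what ties the data-driven model back to the soil-structure mechanics, which is the interpretability gain the review argues for.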
In recent years, the Digital Twin (DT) has gained significant interest from academia and industry due to advances in information technology, communication systems, Artificial Intelligence (AI), Cloud Computing (CC), and the Industrial Internet of Things (IIoT). The main concept of the DT is to provide a comprehensive, tangible, and operational description of any element, asset, or system. However, it is an extremely dynamic taxonomy that grows in complexity over the life cycle and produces a massive amount of generated data and information. Likewise, with the development of AI, digital twins can be redefined and could become a crucial approach for aiding Internet of Things (IoT)-based DT applications in transferring data and value onto the Internet with better decision-making. Therefore, this paper introduces an efficient DT-based fault diagnosis model built on machine learning (ML) tools. In this framework, the DT model of the machine is constructed by creating a simulation model, and a Genetic Algorithm (GA) is used for the optimization task to improve classification accuracy. Furthermore, we evaluate the proposed fault diagnosis framework using performance metrics such as precision, accuracy, F-measure, and recall. The framework is comprehensively examined on a triplex pump fault diagnosis case. The experimental results demonstrate that the hybrid GA-ML method gives outstanding results compared to ML methods such as Logistic Regression (LR), Naïve Bayes (NB), and Support Vector Machine (SVM), with the hybrid GA-SVM achieving the highest accuracy of 95%. The proposed framework will effectively help industrial operators make appropriate decisions concerning fault analysis for IIoT applications in the context of Industry 4.0.
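The GA-SVM hybrid described above pairs a genetic search with a classifier whose fitness is cross-validated accuracy. The paper's exact chromosome encoding and operators are not given, so the following is only a minimal sketch, assuming the GA tunes the SVC hyperparameters C and gamma on synthetic data standing in for the triplex-pump features.

```python
import random
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

random.seed(0)
np.random.seed(0)

# Synthetic multi-class data standing in for the triplex-pump fault features.
X, y = make_classification(n_samples=300, n_features=10, n_informative=6,
                           n_classes=3, random_state=0)

def fitness(genes):
    """Cross-validated accuracy of an SVC with the candidate (C, gamma)."""
    C, gamma = genes
    return cross_val_score(SVC(C=C, gamma=gamma), X, y, cv=3).mean()

def random_genes():
    # Log-uniform initialization over plausible hyperparameter ranges
    return (10 ** random.uniform(-2, 3), 10 ** random.uniform(-4, 1))

def mutate(genes):
    # Perturb each hyperparameter by up to half a decade
    C, gamma = genes
    return (C * 10 ** random.uniform(-0.5, 0.5),
            gamma * 10 ** random.uniform(-0.5, 0.5))

population = [random_genes() for _ in range(8)]
for generation in range(5):
    scored = sorted(population, key=fitness, reverse=True)
    parents = scored[:4]                        # selection: keep the fittest half
    children = [mutate(random.choice(parents)) for _ in range(4)]
    population = parents + children             # next generation

best = max(population, key=fitness)
best_acc = fitness(best)
```

Real implementations usually add crossover and an explicit stopping criterion; the point here is only that the GA's fitness function wraps the classifier's evaluation metric, which is how the hybrid improves on a fixed-hyperparameter SVM.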
Recently, the Internet of Things (IoT) has been used in various applications such as manufacturing, transportation, agriculture, and healthcare, where it can enhance efficiency and productivity via a remote, intelligent management console. With the increased use of Industrial IoT (IIoT) applications, the risk of severe cyber-attacks has also increased. This has led researchers worldwide to work on developing effective Intrusion Detection Systems (IDS) to protect IoT infrastructure against malicious activities. Therefore, this paper provides an effective IDS to detect and classify unforeseen and unpredictable severe attacks against IoT infrastructure. A comprehensive evaluation on the newly available benchmark TON_IoT dataset is presented. This data-driven IoT/IIoT dataset incorporates a label feature indicating normal and attack classes targeting IoT/IIoT applications. It comprises IoT/IIoT service telemetry data, operating system logs, and IoT network traffic collected from a realistic medium-scale IoT network, enabling intrusion activity to be classified and recognized efficiently in IoT environments. Several machine learning algorithms, namely Logistic Regression (LR), Linear Discriminant Analysis (LDA), K-Nearest Neighbors (KNN), Gaussian Naive Bayes (NB), Classification and Regression Tree (CART), Random Forest (RF), and AdaBoost (AB), are applied to thirteen different intrusion datasets, and several performance metrics (accuracy, precision, recall, and F1-score) are used to evaluate the proposed framework. The experimental results show that CART surpasses the other algorithms, with the highest accuracy values of 0.97, 1.00, 0.99, 0.99, 1.00, 1.00, and 1.00 for detecting intrusion activities on the IoT/IIoT infrastructure across most of the employed datasets. In addition, the proposed work achieves high performance compared to other recent related works in terms of different security and detection evaluation parameters.
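The evaluation loop described above (several classifiers, four metrics each) is straightforward to sketch. The snippet below uses synthetic binary "normal vs. attack" data as a stand-in; the real TON_IoT dataset and its thirteen splits are not reproduced here, and only three of the seven listed classifiers are shown (DecisionTreeClassifier plays the role of CART).

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import (accuracy_score, f1_score,
                             precision_score, recall_score)
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

# Synthetic stand-in for labeled IoT/IIoT telemetry (0 = normal, 1 = attack)
X, y = make_classification(n_samples=1000, n_features=20, random_state=42)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=42)

models = {
    "LR": LogisticRegression(max_iter=1000),
    "CART": DecisionTreeClassifier(random_state=42),
    "RF": RandomForestClassifier(random_state=42),
}

# Fit each model and collect the four metrics the abstract names
results = {}
for name, model in models.items():
    y_pred = model.fit(X_tr, y_tr).predict(X_te)
    results[name] = {
        "accuracy": accuracy_score(y_te, y_pred),
        "precision": precision_score(y_te, y_pred),
        "recall": recall_score(y_te, y_pred),
        "f1": f1_score(y_te, y_pred),
    }
```

Ranking the models on held-out accuracy, as done here, is exactly the comparison that led the authors to single out CART.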
Network management and multimedia data mining techniques take a great interest in analyzing and improving network traffic processing. In recent times, the most complex task in a Software Defined Network (SDN), which is based on a centralized, programmable controller, is security. Therefore, monitoring network traffic is significant for identifying and revealing intrusion anomalies in the SDN environment. Consequently, this paper provides an extensive analysis and investigation of the NSL-KDD dataset using five different clustering algorithms: K-means, Farthest First, Canopy, a density-based algorithm, and Expectation-Maximization (EM), using the Waikato Environment for Knowledge Analysis (WEKA) software to compare these five algorithms extensively. Furthermore, this paper presents an SDN-based intrusion detection system using a deep learning (DL) model with the KDD (Knowledge Discovery in Databases) dataset. First, the utilized dataset is clustered into normal and four major attack categories via the clustering process. Then, a deep learning method is proposed for building an efficient SDN-based intrusion detection system. The results provide a comprehensive analysis and a reasonable study of the different kinds of attacks incorporated in the KDD dataset. Moreover, the outcomes reveal that the proposed deep learning method provides efficient intrusion detection performance compared to existing techniques; for example, it achieves a detection accuracy of 94.21% on the examined dataset.
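The first step above, grouping traffic records into "normal plus four major attack categories", amounts to five-way clustering. As a hedged sketch, the snippet below runs two of the paper's five algorithms that scikit-learn provides directly, K-means and Gaussian-mixture EM, on synthetic feature vectors standing in for NSL-KDD records (Farthest First and Canopy are WEKA-specific and omitted).

```python
from sklearn.cluster import KMeans
from sklearn.datasets import make_blobs
from sklearn.mixture import GaussianMixture

# Synthetic stand-in for NSL-KDD feature vectors: five well-separated groups,
# mirroring the normal + four-attack-category structure described above.
X, _ = make_blobs(n_samples=500, centers=5, n_features=8, random_state=1)

# K-means assigns each record to the nearest of five centroids
kmeans_labels = KMeans(n_clusters=5, n_init=10, random_state=1).fit_predict(X)

# EM fits a five-component Gaussian mixture and assigns by highest posterior
em_labels = GaussianMixture(n_components=5, random_state=1).fit_predict(X)
```

In the paper's pipeline, the resulting cluster labels define the five traffic categories that the downstream deep learning classifier is then trained to detect.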
During the COVID-19 crisis, the need to stay at home increased dramatically, and the number of sick people, especially elderly persons, increased exponentially. In such a scenario, home monitoring of patients can ensure remote healthcare at home using advanced technologies such as the Internet of Medical Things (IoMT). The IoMT can monitor and transmit sensitive health data; however, it may be vulnerable to various attacks. In this paper, an efficient healthcare security system is proposed for IoMT applications. In the proposed system, medical sensors transmit sensed, encrypted health data via a mobile application to the doctor to preserve privacy. Then, three consortium blockchains are constructed for load balancing of transactions and reducing transaction latency. They store the credentials of system entities, doctors' prescriptions and recommendations based on the data transmitted via the mobile application, and the medical treatment process. Besides, cancelable biometrics are used to provide authentication and increase the security of the proposed medical system. The investigational results show that the proposed system outperforms existing work: the proposed model consumed less processing time by 18%, 22%, and 40%, less energy for processing a 200 KB file by 9%, 13%, and 17%, and less memory usage by 7%, 7%, and 18.75%. From these results, it is clear that the proposed system gives a very reliable and secure performance for efficiently securing medical applications.
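The "cancelable biometrics" idea mentioned above can be illustrated with a keyed one-way transform: the system stores a derived template rather than the raw biometric, and if the template leaks, the user-specific key is revoked and a new, unlinkable template is issued. The paper does not specify its transform; HMAC-SHA256 and the feature encoding below are stand-in assumptions for illustration only.

```python
import hashlib
import hmac

def enroll(biometric_features: bytes, user_key: bytes) -> str:
    """Derive a cancelable template from raw features and a revocable key."""
    return hmac.new(user_key, biometric_features, hashlib.sha256).hexdigest()

def verify(biometric_features: bytes, user_key: bytes, template: str) -> bool:
    """Re-derive the template from a fresh sample and compare in constant time."""
    candidate = enroll(biometric_features, user_key)
    return hmac.compare_digest(candidate, template)

features = b"minutiae:12,34;57,89;102,11"  # illustrative fingerprint features
template = enroll(features, b"key-v1")

ok_same = verify(features, b"key-v1", template)      # matches with original key
ok_revoked = verify(features, b"key-v2", template)   # fails after key revocation
```

The security benefit is that the stored template reveals nothing about the raw biometric, and unlike a password, a compromised biometric cannot be changed, so revocability has to come from the key.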
Funding: supported by Prince Sultan University (Grant No. PSU-CE-TECH-135, 2023).
Funding: Princess Nourah bint Abdulrahman University Researchers Supporting Project Number (PNURSP2022R197), Princess Nourah bint Abdulrahman University, Riyadh, Saudi Arabia.