Ceramic relief mural is a contemporary landscape art, carefully designed around human nature, culture, and architectural wall space, and combined with social customs, visual sensibility, and art. It may also become the main axis of ceramic art in the future. Taiwan's public ceramic relief murals (PCRM) are most distinctive, with the genre pioneered by Pan-Hsiung Chu of Meinong Kiln in 1987. In addition to breaking through the limitations of traditional public ceramic murals, Chu leveraged local culture and sensibility, giving PCRM a unique style and innovative value throughout the Taiwan region. This study analyzes the design image of public ceramic murals, taking the design and creation of Taiwan PCRM as its scope. It applies STEEP analysis, in which the social, technological, economic, ecological, and political-legal environments are treated as core factors, to evaluate the eight most important factors in the artistic design image of ceramic murals. Interpretive structural modeling (ISM) is then used to establish five levels; to analyze the four main problems in the core factor area and the four main target results in the affected factor area; and to examine these problem points and target points together with their causal relationships. The study is expected to sort out the relationships among these factors, obtain their hierarchical structure, and provide a reference basis and research methods.
Alarm floods are one of the main problems in the alarm systems of industrial processes. Alarm root-cause analysis and alarm prioritization help reduce alarm floods. This paper proposes a systematic rationalization method for multivariate correlated alarms that realizes both root-cause analysis and alarm prioritization. An information-fusion-based interpretive structural model is constructed from data-driven partial correlation coefficient calculation and process knowledge modification. This hierarchical multi-layer model aids in abnormality propagation path identification and root-cause analysis. A revised Likert scale method is adopted to determine alarm priorities and reduce the blindness of alarm handling. As a case study, the Tennessee Eastman process is used to show the effectiveness and validity of the proposed approach. A comparison of alarm system performance shows that the rationalization methodology can reduce alarm floods to some extent and improve performance.
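The data-driven step in the abstract above rests on partial correlation, which measures the association between two process variables after removing the influence of a third. A minimal first-order sketch, in pure Python; the toy data and variable names are illustrative, not taken from the paper:

```python
import math

def pearson(x, y):
    """Plain Pearson correlation coefficient of two equal-length series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return sum((a - mx) * (b - my) for a, b in zip(x, y)) / (sx * sy)

def partial_corr(x, y, z):
    """First-order partial correlation of x and y with z's influence removed."""
    rxy, rxz, ryz = pearson(x, y), pearson(x, z), pearson(y, z)
    return (rxy - rxz * ryz) / math.sqrt((1 - rxz ** 2) * (1 - ryz ** 2))
```

In an alarm-rationalization setting, pairs of alarm variables whose partial correlation stays high after conditioning on common causes are candidates for a direct influence edge in the interpretive structural model.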
Through a review of the related literature, we identify six major factors influencing the financing of China's forestry enterprises: insufficient national support; regulatory and institutional environmental factors; narrow financing channels; an inappropriate existing mortgage-backed approach; the characteristics of forestry production; and the enterprises' own defects. We then use interpretive structural modeling (ISM) from systems engineering to analyze the structure of these six factors and build a ladder-type hierarchy. Three factors (the characteristics of forestry production, the enterprises' own shortcomings, and regulatory and institutional environmental factors) are identified as basic factors, and the other three as important factors. From the perspectives of government and enterprises, we offer suggestions based on the basic and important factors to ease the financing difficulties of forestry enterprises.
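Several of the studies above rely on ISM's level-partitioning step: starting from a binary influence matrix, factors whose reachability set is contained in their antecedent set are peeled off as the next level. A compact sketch under an assumed influence matrix (the example matrix is illustrative):

```python
def transitive_closure(adj):
    """Warshall's algorithm; adj[i][j] = 1 if factor i directly influences j.
    The diagonal is set to 1, as ISM reachability matrices are reflexive."""
    n = len(adj)
    r = [row[:] for row in adj]
    for i in range(n):
        r[i][i] = 1
    for k in range(n):
        for i in range(n):
            for j in range(n):
                r[i][j] = r[i][j] or (r[i][k] and r[k][j])
    return r

def ism_levels(adj):
    """Partition factors into ISM levels; level 1 holds the most dependent factors."""
    r = transitive_closure(adj)
    remaining = set(range(len(adj)))
    levels = []
    while remaining:
        level = []
        for i in remaining:
            reach = {j for j in remaining if r[i][j]}
            ante = {j for j in remaining if r[j][i]}
            if reach <= ante:  # reachability set equals its intersection with antecedents
                level.append(i)
        levels.append(sorted(level))
        remaining -= set(level)
    return levels
```

For a simple chain 0 → 1 → 2, the partition places factor 2 (purely dependent) at the top level and factor 0 (the driving factor) at the bottom, mirroring how the forestry study separates "basic" driving factors from the rest.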
Background: Effective knowledge translation allows optimised access to and utilisation of research knowledge in order to inform and enhance public health policy and practice. In low- and middle-income countries, substantial complexities affect the way research can be utilised for public health action. This review attempts to draw out concepts in the literature that help define the complexities and contextual factors influencing knowledge translation for public health in low- and middle-income countries. Methods: A Critical Interpretive Synthesis was undertaken, a method of analysis that allows a critical review of a wide range of heterogeneous evidence by incorporating systematic review methods with qualitative enquiry techniques. A search for peer-reviewed articles published between 2000 and 2016 on knowledge translation for public health in low- and middle-income countries was carried out, and 85 articles were reviewed and analysed using this method. Results: Four main concepts were identified: 1) tension between 'global' and 'local' health research; 2) complexities in creating and accessing evidence; 3) contextualising knowledge translation strategies for low- and middle-income countries; and 4) the unique role of non-government organisations in the knowledge translation process. Conclusion: This method of review enabled the identification of key concepts that may inform practice or further research in the field of knowledge translation in low- and middle-income countries.
Possible risk factors during SAP Business One implementation were studied through in-depth interviews, with the results then adjusted by experts. Twenty categories of risk factors, comprising 49 factors in total, were found. Based on these, a questionnaire was used to identify the key risk factors of SAP Business One implementation. The results reveal ten key risk factors: senior managers' leadership, project management, process improvement, implementation team organization, process analysis, base data, personnel coordination, change management, secondary development, and data import. Focusing on these key risks, the interpretive structural modeling approach is used to study the relationships among the factors and establish a seven-level hierarchical structure. The study shows that the structure is olive-like, with the risk of data import at the top and the risk of senior managers' leadership at the bottom; these are the most important risk factors.
Interpretive theory puts forward three phases of interpretation: understanding, deverbalization, and re-expression. Interpretation requires both linguistic and non-linguistic knowledge. This essay discusses the application of interpretive theory to business interpretation from the perspectives of theory and practice.
The interpretive theory of translation (ITT) is a school of theory that originated in France in the late 1960s, focusing on the theory and teaching of interpreting and non-literary translation. ITT holds that what the translator should convey is not the meaning of linguistic notation but the non-verbal sense. In this paper, the author briefly introduces ITT and analyzes several examples to show the situations in which ITT is useful and those in which it is unsuitable.
This paper explores the teaching of interpreting today, starting from the interpretive theory and its characteristics. The author believes the theory is mainly based on the study of interpretation practice; its core concept, "deverbalization", has made great strides and breakthroughs in translation theory. When we examine translation, or rather interpretation, once again from the dual perspective of language and culture, new thoughts emerge about translation as well as the teaching of interpreting.
Background: Population health interventions (PHIs) have the potential to improve the health of large populations by systematically addressing the underlying conditions of poor health outcomes (i.e., social determinants of health) and reducing health inequities. Scaling up may be one means of enhancing the impact of effective PHIs. However, not all scale-up attempts have been successful. To help guide the successful scale-up of a PHI, we look to organizational readiness for change theory for a new perspective on how the scale-up pathway may be better understood. Using this change theory, our goal was to develop the foundations of an evidence-based, theory-informed framework for a PHI through a critical examination of PHI scale-up experiences documented in the literature. Methods: We conducted a multi-step critical interpretive synthesis (CIS) to gather and examine insights from scale-up experiences detailed in the peer-reviewed and grey literatures, with a focus on PHIs from a variety of global settings. The CIS included iterative cycles of systematic searching, sampling, data extraction, critiquing, interpreting, coding, reflecting, and synthesizing. Theories relevant to innovations, complexity, and organizational readiness guided our analysis and synthesis. Results: We retained and examined twenty different PHI scale-up experiences, extracted from 77 documents (47 peer-reviewed, 30 grey literature) published between 1995 and 2013. Overall, we identified three phases (Groundwork, Implementing Scale-up, and Sustaining Scale-up), 11 actions, and four key components (PHI, context, capacity, stakeholders) pertinent to the scale-up process. Our guiding theories provided explanatory power for various aspects of the scale-up process and of scale-up success, and an alternative perspective on the assessment of scale-up readiness for a PHI. Conclusion: Our synthesis provides the foundations of the Scale-up Readiness Assessment Framework. Our theoretically informed and rigorous synthesis methodology permitted identification of the disparate processes involved in the successful scale-up of a PHI. Our findings complement the guidance and resources currently available and offer an added perspective for assessing scale-up readiness for a PHI.
This paper outlines a diagnostic approach to quantifying the maintainability of a commercial off-the-shelf (COTS)-based system by analyzing the complexity of the deployment of its components. Interpretive structural modeling (ISM) is used to identify and understand interdependencies among COTS components and how they affect the complexity of maintaining the COTS-based system (CBS). Through ISM analysis we determine which components in the CBS contribute most significantly to the system's complexity. With the ISM, architects, system integrators, and system maintainers can isolate the COTS products that cause the most complexity, and therefore the most maintenance effort, and take precautions to change those products only when necessary or during major maintenance efforts. The analysis also clearly shows the components that can be easily replaced or upgraded with very little impact on the rest of the system.
Interpretive structural modeling (ISM) is an interactive process in which an ill-structured problem is organized into a comprehensive systematic model. Despite its many advantages, ISM has some shortcomings, the most important of which is its reliance on participants' intuition and judgment, which undermines its validity. To address this problem and further enhance the ISM method, the present study proposes equation structural modeling (ESM), which draws on the capacities of structural equation modeling (SEM). ESM provides a statistically verifiable framework together with a graphical, hierarchical, and intuitive model.
In this study, 2016–2022 monitoring data from three ecological buoys in the Wenzhou coastal region of Zhejiang Province and the European Centre for Medium-Range Weather Forecasts (ECMWF) dataset were examined to clarify the relationship between variations in ecological parameters during spring algal bloom events and the associated changes in temperature and wind fields. A long short-term memory (LSTM) recurrent neural network was employed to develop a predictive model for spring algal blooms in this region. The model integrated various inputs, including temperature, wind speed, and other pertinent variables, with chlorophyll concentration as the primary output indicator. Model training used chlorophyll concentration data supplemented by reanalysis and forecast temperature and wind field data. The model proved able to forecast next-day chlorophyll concentrations and to assess the likelihood of spring algal bloom occurrences against a defined chlorophyll concentration threshold. Historical validation from 2016 to 2019 corroborated the model's accuracy, with an 81.71% probability of correct prediction, further borne out by its precise prediction of two spring algal bloom events in late April 2023 and early May 2023. An interpretable machine-learning-based model for spring algal bloom prediction, offering effective forecasting with limited data, was thus established through detailed analysis of the spring bloom mechanism and careful selection of input variables. The insights gained offer valuable contributions to the development of early warning systems for spring algal blooms in the Wenzhou coastal area.
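The LSTM at the heart of the bloom model maintains a gated cell state that lets it carry information across many time steps. A scalar, single-cell sketch of the recurrence follows; the weights are arbitrary placeholders, not the trained model:

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def lstm_step(x, h_prev, c_prev, W):
    """One scalar LSTM cell update: forget (f), input (i), and output (o)
    gates plus a candidate value (g) control what the cell remembers."""
    f = sigmoid(W["wf"] * x + W["uf"] * h_prev + W["bf"])
    i = sigmoid(W["wi"] * x + W["ui"] * h_prev + W["bi"])
    o = sigmoid(W["wo"] * x + W["uo"] * h_prev + W["bo"])
    g = math.tanh(W["wg"] * x + W["ug"] * h_prev + W["bg"])
    c = f * c_prev + i * g        # new cell state
    h = o * math.tanh(c)          # new hidden state (the cell's output)
    return h, c

# Placeholder weights and a toy normalized chlorophyll-like series
W = {k: 0.5 for k in ("wf", "uf", "bf", "wi", "ui", "bi",
                      "wo", "uo", "bo", "wg", "ug", "bg")}
h, c = 0.0, 0.0
for x in [0.1, 0.2, 0.5, 0.8]:    # rising signal
    h, c = lstm_step(x, h, c, W)
```

In the paper's setting, the trained network maps such input sequences to a next-day chlorophyll estimate, and a bloom warning is raised when the estimate crosses the defined threshold.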
In the digital age, where traditional culture is in danger of gradually disappearing, one iconic figure has taken it upon herself to reverse the trend by reviving traditional skills. A prominent Chinese vlogger, Li Ziqi, whose birth name is Li Jiajia, has amassed a vast online following by reinterpreting traditions that date back thousands of years. Following a three-year absence from social media, she made a highly anticipated return with the release of three new videos in mid-November 2024. These videos represent the culmination of several months of behind-the-scenes work.
"As a director, I hope that the audience will see a movie in cinemas instead of watching three-minute commentaries, about which I'm really speechless," internationally renowned film director Zhang Yimou said in a recent interview. Zhang believes the ritual and immersive feeling of watching movies in the cinema cannot be replaced by watching movie commentaries online. He called on audiences to go to the cinema and experience in person the audio-visual feast delivered by the big screen. Zhang's views on so-called "three-minute movies" have become a trending topic on social media. The genre, in which content creators on short-video platforms produce summarized versions of films, also includes the creators' own observations. The rise of these platforms has lowered barriers to entry for aspiring movie content creators, giving rise not only to high volume and great diversity but also to large discrepancies in quality. Moreover, this content sometimes infringes the copyrights of the movies it features, while also misinterpreting their plots and themes.
Predicting molecular properties is essential for advancing drug discovery and design. Recently, graph neural networks (GNNs) have gained prominence due to their ability to capture the complex structural and relational information inherent in molecular graphs. Despite their effectiveness, the "black-box" nature of GNNs remains a significant obstacle to their widespread adoption in chemistry, as it hinders interpretability and trust. Several explanation methods based on factual reasoning have emerged in this context; they interpret GNN predictions by analyzing the key features contributing to them. However, these approaches fail to answer a critical question: how can we ensure that the structure-property mapping learned by a GNN is consistent with established domain knowledge? In this paper, we propose MMGCF, a novel counterfactual explanation framework designed specifically for GNN-based molecular property prediction. MMGCF constructs a hierarchical tree structure over molecular motifs, enabling the systematic generation of counterfactuals through motif perturbations. The framework identifies causally significant motifs and elucidates their impact on model predictions, offering insight into the relationship between structural modifications and predicted properties. Its effectiveness is demonstrated through comprehensive quantitative and qualitative evaluations on four real-world molecular datasets.
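Motif-level counterfactual generation can be caricatured, at a much smaller scale, as a search over discrete edits that nudge a model's prediction toward a target value. A toy greedy version over binary "motif present/absent" flags; the scoring function is a stand-in for a trained GNN, and the whole setup is illustrative rather than the paper's method:

```python
def greedy_counterfactual(x, predict, target, max_flips=3):
    """Flip one binary feature at a time, keeping the flip that moves the
    prediction closest to `target` (a toy stand-in for motif perturbation)."""
    x = list(x)
    for _ in range(max_flips):
        best, best_gap = None, abs(predict(x) - target)
        for j in range(len(x)):
            cand = x[:]
            cand[j] = 1 - cand[j]          # perturb one "motif"
            gap = abs(predict(cand) - target)
            if gap < best_gap:
                best, best_gap = cand, gap
        if best is None:                   # no single flip improves: stop
            break
        x = best
    return x
```

The flips that survive identify the features (motifs, in MMGCF's setting) whose presence or absence is most decisive for the prediction.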
Predicting material stability is essential for accelerating the discovery of advanced materials in renewable energy, aerospace, and catalysis. Traditional approaches such as density functional theory (DFT) are accurate but computationally expensive and unsuitable for high-throughput screening. This study introduces a machine learning (ML) framework trained on high-dimensional data from the Open Quantum Materials Database (OQMD) to predict formation energy, a key stability metric. Among the evaluated models, deep learning outperformed gradient boosting machines and random forests, achieving a prediction accuracy of up to R² = 0.88. Feature importance analysis identified thermodynamic, electronic, and structural properties as the primary drivers of stability, offering interpretable insights into material behavior. Compared with DFT, the proposed ML framework significantly reduces computational costs, enabling the rapid screening of thousands of compounds. These results highlight ML's transformative potential in materials discovery, with direct applications in energy storage, semiconductors, and catalysis.
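The R² figure quoted above is the coefficient of determination: the fraction of variance in the target (here, formation energy) explained by the model. For reference, a minimal implementation with toy values, not OQMD data:

```python
def r2_score(y_true, y_pred):
    """Coefficient of determination: 1 - SS_res / SS_tot."""
    mean = sum(y_true) / len(y_true)
    ss_res = sum((t - p) ** 2 for t, p in zip(y_true, y_pred))
    ss_tot = sum((t - mean) ** 2 for t in y_true)
    return 1.0 - ss_res / ss_tot
```

A perfect predictor scores 1.0, and a model that always predicts the mean scores 0.0, so R² = 0.88 means the model explains 88% of the variance left unexplained by the mean baseline.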
The aperture of natural rock fractures significantly affects the deformation and strength properties of rock masses, as well as the hydrodynamic properties of fractured rock masses. Conventional measurement methods are inadequate for collecting data on high, steep rock slopes in complex mountainous regions. This study establishes a high-resolution three-dimensional model of a rock slope using unmanned aerial vehicle (UAV) multi-angle nap-of-the-object photogrammetry to obtain the edge feature points of fractures. Fracture opening morphology is characterized using coordinate projection and transformation, and the fracture central axis is determined using vertical measuring lines, allowing the aperture to be interpreted adaptively to the fracture shape. The feasibility and reliability of the new method are verified at a railway construction site in southeast Tibet, China. The study shows that fracture aperture has significant interval and size effects. The optimal sampling length for fractures is approximately 0.5–1 m, and the best aperture interpretation results are achieved when the measuring-line spacing is 1% of the sampling length. Tensile fractures in the study area generally have larger apertures than shear fractures, and their tendency to increase with slope height is also greater. The aperture of tensile fractures is generally positively correlated with their trace length, while the correlation between the aperture of shear fractures and their trace length appears weak. Fractures of different orientations show certain differences in aperture distribution but generally follow normal, log-normal, and gamma distributions. This study provides essential data support for rock and slope stability evaluation, which is of significant practical importance.
In existing landslide susceptibility prediction (LSP) models, the influence of random errors in landslide conditioning factors on LSP is not considered; instead, the original conditioning factors are taken directly as model inputs, which introduces uncertainty into LSP results. This study aims to reveal how different proportions of random error in the conditioning factors affect LSP uncertainty, and to explore a method that can effectively reduce those random errors. The original conditioning factors are first used to construct original-factor-based LSP models; random errors of 5%, 10%, 15%, and 20% are then added to these factors to construct error-based LSP models. Next, low-pass-filter-based LSP models are constructed by eliminating the random errors with a low-pass filter. Ruijin County, China, with 370 landslides and 16 conditioning factors, serves as the study case, and three typical machine learning models, multilayer perceptron (MLP), support vector machine (SVM), and random forest (RF), are selected as LSP models. The results show that: (1) the low-pass filter can effectively reduce the random errors in the conditioning factors and thereby decrease LSP uncertainty; (2) as the proportion of random error increases from 5% to 20%, LSP uncertainty increases continuously; (3) the original-factor-based models are feasible for LSP in the absence of more accurate conditioning factors; (4) the two sources of uncertainty, the choice of machine learning model and the proportion of random error, influence LSP modeling to a large and roughly equal degree; and (5) Shapley values effectively explain the internal mechanism by which the machine learning models predict landslide susceptibility. In conclusion, a greater proportion of random error in the conditioning factors results in higher LSP uncertainty, and a low-pass filter can effectively reduce these random errors.
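The error-reduction idea above can be illustrated with the simplest low-pass filter, a centered moving average, applied to a conditioning-factor series with synthetic alternating noise. The data are illustrative, and the paper's actual filter design may differ:

```python
def low_pass(series, window=3):
    """Centered moving-average low-pass filter; edge windows shrink."""
    n, half = len(series), window // 2
    out = []
    for i in range(n):
        lo, hi = max(0, i - half), min(n, i + half + 1)
        out.append(sum(series[lo:hi]) / (hi - lo))
    return out

signal = [float(i) for i in range(10)]            # clean factor values
noisy = [s + (0.5 if k % 2 == 0 else -0.5)        # alternating jitter
         for k, s in enumerate(signal)]
filtered = low_pass(noisy)
err_raw = sum((a - b) ** 2 for a, b in zip(noisy, signal))
err_filt = sum((a - b) ** 2 for a, b in zip(filtered, signal))
```

Because high-frequency noise averages toward zero inside each window while the slowly varying signal passes through, the filtered series sits much closer to the clean one, which is exactly the mechanism the study exploits before model training.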
Modern medicine relies on various medical imaging technologies for non-invasively observing patients' anatomy. However, the interpretation of medical images can be highly subjective and dependent on the expertise of clinicians. Moreover, some potentially useful quantitative information in medical images, especially information not visible to the naked eye, is often ignored in clinical practice. In contrast, radiomics performs high-throughput feature extraction from medical images, enabling quantitative analysis and the prediction of various clinical endpoints. Studies have reported that radiomics shows promising performance in diagnosis and in predicting treatment response and prognosis, demonstrating its potential as a non-invasive auxiliary tool for personalized medicine. However, radiomics remains in a developmental phase, as numerous technical challenges have yet to be solved, especially in feature engineering and statistical modeling. In this review, we present the current utility of radiomics by summarizing research on its application to the diagnosis, prognosis, and prediction of treatment response in patients with cancer. We focus on machine learning approaches: for feature extraction and selection during feature engineering, and for imbalanced datasets and multi-modality fusion during statistical modeling. We also discuss the stability, reproducibility, and interpretability of features, and the generalizability and interpretability of models. Finally, we offer possible solutions to current challenges in radiomics research.
With the successful application of deep learning to image segmentation, the use of convolutional neural networks for seismic facies interpretation has developed continuously. These intelligent, automated methods significantly reduce manual labor, particularly the laborious task of manually labeling seismic facies. However, their extensive demand for training data limits wider application. To overcome this challenge, we adopt the UNet architecture as the foundational network for seismic facies classification, which achieves effective segmentation even with small-sample training data. We further integrate spatial pyramid pooling and dilated convolution modules into the network to enhance the perception of spatial information over a broader range. A seismic facies classification test on public data from the F3 block verifies the superior performance of the improved network in delineating seismic facies boundaries. Comparative analysis against the traditional UNet model shows that our method achieves more accurate classification, as evidenced by various image segmentation evaluation metrics, with classification accuracy reaching 96%. The classification results in the seismic slice dimension further confirm the superior performance of the proposed method, which accurately delimits the extent of different seismic facies. This approach holds significant potential for analyzing geological patterns and extracting valuable depositional information.
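Dilated convolution, one of the modules added to the UNet above, enlarges the receptive field by spacing out the kernel taps without adding parameters. A 1-D "valid" sketch with a toy input; the paper's networks are, of course, 2-D and learned:

```python
def dilated_conv1d(x, kernel, dilation):
    """'Valid' 1-D convolution whose kernel taps are `dilation` samples apart."""
    k = len(kernel)
    span = (k - 1) * dilation + 1   # receptive field of the dilated kernel
    return [sum(kernel[j] * x[i + j * dilation] for j in range(k))
            for i in range(len(x) - span + 1)]
```

With dilation 1 this reduces to ordinary convolution; with dilation 2 a two-tap kernel already spans three input samples, which is why stacking dilated layers lets a network aggregate spatial context over a much broader range at the same parameter cost.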
Funding (alarm rationalization study): Supported by the National Natural Science Foundation of China (61473026, 61104131) and the Fundamental Research Funds for the Central Universities (JD1413).
Fund: CM received financial support through the Australian Government Research Training Program Scholarship.
文摘Abstract: Background: Effective knowledge translation allows research knowledge to be accessed and used to inform and enhance public health policy and practice. In low- and middle-income countries, substantial complexities affect the way research can be utilised for public health action. This review draws out concepts in the literature that help define the complexities and contextual factors influencing knowledge translation for public health in low- and middle-income countries. Methods: A critical interpretive synthesis was undertaken, a method of analysis that allows a critical review of a wide range of heterogeneous evidence by combining systematic review methods with qualitative enquiry techniques. A search for peer-reviewed articles published between 2000 and 2016 on knowledge translation for public health in low- and middle-income countries was carried out, and 85 articles were reviewed and analysed. Results: Four main concepts were identified: 1) tension between 'global' and 'local' health research; 2) complexities in creating and accessing evidence; 3) contextualising knowledge translation strategies for low- and middle-income countries; and 4) the unique role of non-government organisations in the knowledge translation process. Conclusion: This method of review enabled the identification of key concepts that may inform practice or further research in the field of knowledge translation in low- and middle-income countries.
文摘Abstract: The possible risk factors during SAP Business One implementation were identified through in-depth interviews, and the results were then adjusted by experts, yielding 20 categories of risk factors comprising 49 factors in total. Based on these, a questionnaire was used to determine the key risk factors of SAP Business One implementation. The results reveal ten key risk factors: senior managers' leadership, project management, process improvement, implementation team organization, process analysis, base data, personnel coordination, change management, secondary development, and data import. Focusing on these key risks, the interpretive structural modeling approach is used to study the relationships among the factors and establish a seven-level hierarchical structure. The study shows that the structure is olive-like: the risk of data import sits at the top and the risk of senior managers' leadership at the bottom, and these are the most important risk factors.
文摘Abstract: Interpretive theory puts forward three phases of interpretation: understanding, deverbalization, and re-expression. It requires both linguistic and non-linguistic knowledge. This essay discusses the application of interpretive theory to business interpretation from the perspectives of theory and practice.
文摘Abstract: The interpretive theory of translation (ITT) is a school of theory that originated in France in the late 1960s, focusing on the theory and teaching of interpreting and non-literary translation. ITT holds that what the translator should convey is not the meaning of linguistic notation but the non-verbal sense. In this paper, the author briefly introduces ITT and analyzes several examples to show the situations in which ITT is either useful or unsuitable.
文摘Abstract: This paper explores the teaching of interpreting today, starting from the interpretive theory and its characteristics. The author believes that the theory is mainly based on the study of interpretation practice, and that its core content, "deverbalization", has made great strides and breakthroughs in translation theory; when we examine translation, or rather interpretation, once again from the dual perspective of language and culture, new thoughts emerge about translation as well as the teaching of interpreting.
基金supported by DTKN’s Canadian Institutes of Health Research(CIHR)Population Health Intervention Research Centre(PHIRC)Doctoral Scholarship,Province of Alberta Queen Elizabeth II Scholarship,and the CIHR Population Health Intervention Research Network(PHIRNET)Doctoral Traineeshipfunded as a CIHR-NBHRF Health Systems Impact Post-Doctoral Fellow at the University of New Brunswick and University of Ottawa.
文摘Abstract: Background: Population health interventions (PHIs) have the potential to improve the health of large populations by systematically addressing the underlying conditions of poor health outcomes (i.e., the social determinants of health) and reducing health inequities. Scaling up may be one means of enhancing the impact of effective PHIs. However, not all scale-up attempts have been successful. To help guide the successful scale-up of a PHI, we turn to organizational readiness for change theory for a new perspective on the scale-up pathway. Using this change theory, our goal was to develop the foundations of an evidence-based, theory-informed scale-up framework for PHIs through a critical examination of PHI scale-up experiences documented in the literature. Methods: We conducted a multi-step critical interpretive synthesis (CIS) to gather and examine insights from scale-up experiences detailed in the peer-reviewed and grey literatures, focusing on PHIs from a variety of global settings. The CIS included iterative cycles of systematic searching, sampling, data extraction, critiquing, interpreting, coding, reflecting, and synthesizing. Theories relevant to innovations, complexity, and organizational readiness guided our analysis and synthesis. Results: We retained and examined twenty PHI scale-up experiences, extracted from 77 documents (47 peer-reviewed, 30 grey literature) published between 1995 and 2013. Overall, we identified three phases (Groundwork, Implementing Scale-up, and Sustaining Scale-up), 11 actions, and four key components (the PHI, context, capacity, and stakeholders) pertinent to the scale-up process. Our guiding theories provided explanatory power for various aspects of the scale-up process and of scale-up success, and an alternative perspective on assessing scale-up readiness for a PHI. Conclusion: Our synthesis provided the foundations of the Scale-up Readiness Assessment Framework. Our theoretically informed and rigorous synthesis methodology permitted identification of the disparate processes involved in the successful scale-up of a PHI. Our findings complement the guidance and resources currently available and offer an added perspective on assessing scale-up readiness for a PHI.
文摘Abstract: This paper outlines a diagnostic approach to quantifying the maintainability of a commercial off-the-shelf (COTS)-based system by analyzing the complexity of the deployment of the system components. Interpretive structural modeling (ISM) is used to demonstrate how ISM helps identify and understand interdependencies among COTS components and how they affect the complexity of maintaining the COTS-based system (CBS). Through ISM analysis we determined which components in the CBS contribute most significantly to the complexity of the system. With ISM, architects, system integrators, and system maintainers can isolate the COTS products that cause the most complexity, and therefore the most maintenance effort, and take precautions to change those products only when necessary or during major maintenance efforts. The analysis also clearly shows the components that can be easily replaced or upgraded with very little impact on the rest of the system.
文摘Abstract: Interpretive structural modeling (ISM) is an interactive process in which an ill-structured problem is organized into a comprehensive systematic model. Yet despite the many advantages that ISM provides, the method has some shortcomings, the most important of which is its reliance on participants' intuition and judgment. This problem undermines the validity of ISM. To solve it and further enhance the ISM method, the present study proposes a method called equation structural modeling (ESM), which draws on the capacities of structural equation modeling (SEM). ESM provides a statistically verifiable framework while retaining a graphical, hierarchical, and intuitive model.
Fund: Supported by the Zhejiang Provincial Natural Science Foundation of China (No. LY21D060003); the Project of the State Key Laboratory of Satellite Ocean Environment Dynamics, Second Institute of Oceanography, MNR (No. SOEDZZ2103); the National Natural Science Foundation of China (No. 42076216); and the Open Research Fund of the Key Laboratory of Marine Ecological Monitoring and Restoration Technologies, MNR (No. MEMRT202210).
文摘Abstract: In this study, the 2016–2022 monitoring data from three ecological buoys in the Wenzhou coastal region of Zhejiang Province, together with a European Centre for Medium-Range Weather Forecasts (ECMWF) dataset, were examined to clarify the relationship between variations in ecological parameters during spring algal bloom incidents and the associated changes in temperature and wind fields. A long short-term memory (LSTM) recurrent neural network was employed, and a predictive model for spring algal blooms in this region was developed. The model integrated various inputs, including temperature, wind speed, and other pertinent variables, with chlorophyll concentration serving as the primary output indicator. Model training used chlorophyll concentration data, supplemented by reanalysis and forecast temperature and wind field data. The model demonstrated proficiency in forecasting next-day chlorophyll concentrations and in assessing the likelihood of spring algal bloom occurrences via a defined chlorophyll concentration threshold. Historical validation from 2016 to 2019 corroborated the model's accuracy, with an 81.71% probability of correct prediction, further proven by its precise prediction of two spring algal bloom incidents in late April 2023 and early May 2023. An interpretable machine-learning-based model for spring algal bloom prediction, capable of effective forecasting with limited data, was established through detailed analysis of the spring algal bloom mechanism and careful selection of input variables. The insights gained offer valuable contributions to the development of early warning systems for spring algal blooms in the Wenzhou coastal area of Zhejiang Province.
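The forecasting workflow described here, sequence inputs (temperature, wind, chlorophyll) fed through an LSTM and a chlorophyll threshold turned into a bloom warning, can be sketched with a single hand-written LSTM cell. Everything below is a placeholder illustration: the weights are random, the 5.0 µg/L threshold and the three input variables are assumptions for the sketch, not the paper's trained model.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h, c, W, U, b):
    """One LSTM time step. W: (4H, D), U: (4H, H), b: (4H,).
    Gate order: input, forget, cell candidate, output."""
    H = h.shape[0]
    z = W @ x + U @ h + b
    i = sigmoid(z[0:H])            # input gate
    f = sigmoid(z[H:2 * H])        # forget gate
    g = np.tanh(z[2 * H:3 * H])    # candidate cell state
    o = sigmoid(z[3 * H:4 * H])    # output gate
    c_new = f * c + i * g
    h_new = o * np.tanh(c_new)
    return h_new, c_new

rng = np.random.default_rng(0)
D, H = 3, 8                        # 3 inputs: temperature, wind speed, chlorophyll
W = rng.normal(0, 0.1, (4 * H, D))  # random placeholder weights (untrained)
U = rng.normal(0, 0.1, (4 * H, H))
b = np.zeros(4 * H)

h, c = np.zeros(H), np.zeros(H)
for x in rng.normal(0, 1, (7, D)):  # a week of standardized daily inputs
    h, c = lstm_step(x, h, c, W, U, b)

chl_pred = float(h @ rng.normal(0, 0.1, H))  # linear readout (untrained)
BLOOM_THRESHOLD = 5.0                         # hypothetical ug/L threshold
print("bloom warning:", chl_pred > BLOOM_THRESHOLD)
```

In practice a framework (e.g. a deep learning library) would train the weights on the buoy and reanalysis sequences; the sketch only shows how the recurrent state carries multi-day context into the next-day chlorophyll estimate.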
文摘Abstract: In the digital age, when traditional culture is in danger of gradually disappearing, one iconic figure has taken it upon herself to reverse the trend by reviving traditional skills. The prominent Chinese vlogger Li Ziqi, whose birth name is Li Jiajia, has amassed a vast online following by reinterpreting traditions that date back thousands of years. Following a three-year absence from social media, she made a highly anticipated return with the release of three new videos in mid-November 2024. These videos represent the culmination of several months of behind-the-scenes work.
文摘Abstract: “As a director, I hope that the audience will see a movie in cinemas instead of watching three-minute commentaries, about which I'm really speechless,” internationally renowned film director Zhang Yimou said in a recent interview. Zhang believes the ritual and immersive feeling of watching movies in the cinema cannot be replaced by watching movie commentaries online. He called on audiences to go to the cinema and experience in person the audio-visual feast delivered by the big screen. Zhang's views on the so-called “three-minute movies” have become a trending topic on social media. In this new genre, content creators on short-video platforms produce summarized versions of films that also include the creators' own observations. The rise of these platforms has lowered barriers to entry for aspiring movie content creators, giving rise not only to high volume and great diversity but also to large discrepancies in quality. Moreover, this content sometimes infringes the copyrights of the movies it features, while also misinterpreting their plots and themes.
文摘Abstract: Predicting molecular properties is essential for advancing drug discovery and design. Recently, graph neural networks (GNNs) have gained prominence due to their ability to capture the complex structural and relational information inherent in molecular graphs. Despite their effectiveness, the "black-box" nature of GNNs remains a significant obstacle to their widespread adoption in chemistry, as it hinders interpretability and trust. In this context, several explanation methods based on factual reasoning have emerged. These methods aim to interpret the predictions made by GNNs by analyzing the key features contributing to the prediction. However, these approaches fail to answer a critical question: how can we ensure that the structure-property mapping learned by GNNs is consistent with established domain knowledge? In this paper, we propose MMGCF, a novel counterfactual explanation framework designed specifically for GNN-based molecular property prediction. MMGCF constructs a hierarchical tree structure over molecular motifs, enabling the systematic generation of counterfactuals through motif perturbations. The framework identifies causally significant motifs and elucidates their impact on model predictions, offering insights into the relationship between structural modifications and predicted properties. Our method demonstrates its effectiveness through comprehensive quantitative and qualitative evaluations on four real-world molecular datasets.
文摘Abstract: Predicting material stability is essential for accelerating the discovery of advanced materials in renewable energy, aerospace, and catalysis. Traditional approaches, such as density functional theory (DFT), are accurate but computationally expensive and unsuitable for high-throughput screening. This study introduces a machine learning (ML) framework trained on high-dimensional data from the Open Quantum Materials Database (OQMD) to predict formation energy, a key stability metric. Among the evaluated models, deep learning outperformed gradient boosting machines and random forests, achieving a prediction accuracy of up to R² = 0.88. Feature importance analysis identified thermodynamic, electronic, and structural properties as the primary drivers of stability, offering interpretable insights into material behavior. Compared with DFT, the proposed ML framework significantly reduces computational costs, enabling the rapid screening of thousands of compounds. These results highlight ML's transformative potential in materials discovery, with direct applications in energy storage, semiconductors, and catalysis.
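The accuracy figure quoted above is R², the coefficient of determination, which compares the model's residual error against the variance of the reference values. A minimal sketch of the computation (the formation-energy values below are invented for illustration, not from the OQMD):

```python
# R^2 = 1 - SS_res / SS_tot: the fraction of variance in the reference
# values explained by the predictions.

def r_squared(y_true, y_pred):
    mean = sum(y_true) / len(y_true)
    ss_res = sum((t - p) ** 2 for t, p in zip(y_true, y_pred))
    ss_tot = sum((t - mean) ** 2 for t in y_true)
    return 1.0 - ss_res / ss_tot

# Hypothetical formation energies (eV/atom): DFT reference vs. ML prediction.
y_true = [-1.20, -0.85, -0.40, -2.10, -1.55]
y_pred = [-1.10, -0.90, -0.55, -2.00, -1.60]
print(round(r_squared(y_true, y_pred), 3))  # → 0.972
```

An R² of 0.88 thus means the framework's predictions account for 88% of the variance in the reference formation energies, at a tiny fraction of DFT's cost.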
Fund: This work was supported by the National Natural Science Foundation of China (Grant Nos. 42177139 and 41941017) and the Natural Science Foundation Project of Jilin Province, China (Grant No. 20230101088JC). The authors would like to thank the anonymous reviewers for their comments and suggestions.
文摘Abstract: The aperture of natural rock fractures significantly affects the deformation and strength properties of rock masses, as well as the hydrodynamic properties of fractured rock masses. Conventional measurement methods are inadequate for collecting data on high and steep rock slopes in complex mountainous regions. This study establishes a high-resolution three-dimensional model of a rock slope using unmanned aerial vehicle (UAV) multi-angle nap-of-the-object photogrammetry to obtain the edge feature points of fractures. Fracture opening morphology is characterized using coordinate projection and transformation. The fracture central axis is determined using vertical measuring lines, allowing the aperture to be interpreted adaptively to the fracture shape. The feasibility and reliability of the new method are verified at a railway construction site in southeast Tibet, China. The study shows that fracture aperture has significant interval and size effects. The optimal sampling length for fractures is approximately 0.5-1 m, and the best aperture interpretation results are achieved when the measuring line spacing is 1% of the sampling length. Tensile fractures in the study area generally have larger apertures than shear fractures, and their tendency to increase with slope height is also greater. The aperture of tensile fractures is generally positively correlated with their trace length, while the correlation between the aperture of shear fractures and their trace length appears weak. Fractures of different orientations show certain differences in their aperture distributions, but these generally follow normal, log-normal, and gamma distributions. This study provides essential data support for rock and slope stability evaluation, which is of significant practical importance.
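The measuring-line idea above, projecting the two fracture edges into a plane and reading the gap along lines spaced at 1% of the sampling length, can be illustrated with a toy computation. The edge curves and dimensions below are synthetic placeholders standing in for photogrammetry-derived edge feature points, not the paper's field data or exact algorithm:

```python
import numpy as np

def aperture_profile(upper, lower, x_samples):
    """upper/lower: (N, 2) arrays of projected edge points sorted by x.
    Returns the aperture (upper edge minus lower edge) at each measuring line."""
    up = np.interp(x_samples, upper[:, 0], upper[:, 1])
    lo = np.interp(x_samples, lower[:, 0], lower[:, 1])
    return up - lo

L = 1.0                              # sampling length ~0.5-1 m (per the study)
x = np.arange(0.0, L, 0.01 * L)      # measuring lines at 1% of sampling length

# Synthetic edge curves a few millimetres apart (coordinates in metres).
xs = np.linspace(0.0, L, 50)
upper = np.column_stack([xs, 0.004 + 0.001 * np.sin(6 * xs)])
lower = np.column_stack([xs, -0.004 + 0.001 * np.cos(6 * xs)])

ap = aperture_profile(upper, lower, x)
print("mean aperture (mm):", round(float(ap.mean()) * 1000, 2))
```

Sampling the gap on a dense, regular set of measuring lines is what makes the interval and size effects described in the abstract measurable: statistics of `ap` change as the sampling length and line spacing change.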
Fund: This work is funded by the National Natural Science Foundation of China (Grant Nos. 42377164 and 52079062) and the National Science Fund for Distinguished Young Scholars of China (Grant No. 52222905).
文摘Abstract: Existing landslide susceptibility prediction (LSP) models do not consider the influence of random errors in landslide conditioning factors; instead, the original conditioning factors are taken directly as model inputs, which brings uncertainty to LSP results. This study aims to reveal how different proportions of random error in the conditioning factors affect LSP uncertainty, and to explore a method that can effectively reduce these random errors. The original conditioning factors are first used to construct original-factor-based LSP models, and then random errors of 5%, 10%, 15%, and 20% are added to these factors to construct error-based LSP models. Second, low-pass-filter-based LSP models are constructed by eliminating the random errors with a low-pass filter. Third, Ruijin County, China, with 370 landslides and 16 conditioning factors, is used as the study case. Three typical machine learning models, multilayer perceptron (MLP), support vector machine (SVM), and random forest (RF), are selected as LSP models. Finally, the LSP uncertainties are discussed, and the results show that: (1) the low-pass filter can effectively reduce the random errors in conditioning factors and thereby decrease LSP uncertainty; (2) as the proportion of random error increases from 5% to 20%, LSP uncertainty increases continuously; (3) the original-factor-based models are feasible for LSP in the absence of more accurate conditioning factors; (4) the two sources of uncertainty, the choice of machine learning model and the proportion of random error, influence LSP modeling to a large and roughly equal degree; and (5) Shapley values effectively explain the internal mechanism by which the machine learning models predict landslide susceptibility. In conclusion, a greater proportion of random error in the conditioning factors results in higher LSP uncertainty, and a low-pass filter can effectively reduce these random errors.
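The core mechanism, a low-pass filter suppressing percentage-scale random errors added to a conditioning factor, can be demonstrated on a synthetic 1-D profile. The signal, the ±15% error level, and the moving-average filter below are illustrative assumptions; the study's factors are gridded maps and its filter may differ, but the principle is the same:

```python
import numpy as np

def moving_average(x, k=7):
    """Simple FIR low-pass filter: k-point moving average."""
    kernel = np.ones(k) / k
    return np.convolve(x, kernel, mode="same")

rng = np.random.default_rng(42)
clean = np.sin(np.linspace(0, 4 * np.pi, 200))       # underlying factor signal
noisy = clean * (1 + rng.uniform(-0.15, 0.15, 200))  # +-15% random error added
filtered = moving_average(noisy)

def rmse(a, b):
    return float(np.sqrt(np.mean((a - b) ** 2)))

print("RMSE before filtering:", round(rmse(noisy, clean), 4))
print("RMSE after  filtering:", round(rmse(filtered, clean), 4))
```

Averaging over k samples attenuates uncorrelated noise by roughly √k while leaving the slowly varying factor signal nearly intact, which is why the filtered-factor models in the study show lower LSP uncertainty than the error-based ones.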
基金supported in part by the National Natural Science Foundation of China(82072019)the Shenzhen Basic Research Program(JCYJ20210324130209023)+5 种基金the Shenzhen-Hong Kong-Macao S&T Program(Category C)(SGDX20201103095002019)the Mainland-Hong Kong Joint Funding Scheme(MHKJFS)(MHP/005/20),the Project of Strategic Importance Fund(P0035421)the Projects of RISA(P0043001)from the Hong Kong Polytechnic University,the Natural Science Foundation of Jiangsu Province(BK20201441)the Provincial and Ministry Co-constructed Project of Henan Province Medical Science and Technology Research(SBGJ202103038,SBGJ202102056)the Henan Province Key R&D and Promotion Project(Science and Technology Research)(222102310015)the Natural Science Foundation of Henan Province(222300420575),and the Henan Province Science and Technology Research(222102310322).
文摘Abstract: Modern medicine relies on various medical imaging technologies to non-invasively observe patients' anatomy. However, the interpretation of medical images can be highly subjective and dependent on the expertise of clinicians. Moreover, some potentially useful quantitative information in medical images, especially that which is not visible to the naked eye, is often ignored in clinical practice. In contrast, radiomics performs high-throughput feature extraction from medical images, enabling quantitative analysis and the prediction of various clinical endpoints. Studies have reported that radiomics shows promising performance in diagnosis and in predicting treatment responses and prognosis, demonstrating its potential as a non-invasive auxiliary tool for personalized medicine. However, radiomics remains in a developmental phase, as numerous technical challenges have yet to be solved, especially in feature engineering and statistical modeling. In this review, we describe the current utility of radiomics by summarizing research on its application to the diagnosis, prognosis, and prediction of treatment responses in patients with cancer. We focus on machine learning approaches: for feature extraction and selection during feature engineering, and for imbalanced datasets and multi-modality fusion during statistical modeling. Furthermore, we discuss the stability, reproducibility, and interpretability of features, and the generalizability and interpretability of models. Finally, we offer possible solutions to current challenges in radiomics research.
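The "high-throughput feature extraction" in the passage starts from simple first-order intensity statistics computed over a region of interest (ROI); real radiomics pipelines compute hundreds of such features plus texture and shape descriptors. The sketch below uses a synthetic image and a square ROI mask purely for illustration:

```python
import numpy as np

def first_order_features(image, mask, bins=16):
    """A few first-order radiomic features from the masked ROI intensities."""
    roi = image[mask]
    hist, _ = np.histogram(roi, bins=bins)
    p = hist / hist.sum()
    p = p[p > 0]                                   # drop empty bins for log
    return {
        "mean": float(roi.mean()),
        "std": float(roi.std()),
        "entropy": float(-(p * np.log2(p)).sum()),  # intensity histogram entropy
    }

rng = np.random.default_rng(1)
image = rng.normal(100, 20, (64, 64))   # synthetic CT-like slice (placeholder)
mask = np.zeros((64, 64), dtype=bool)
mask[20:40, 20:40] = True               # square "lesion" ROI (placeholder)

feats = first_order_features(image, mask)
print(sorted(feats))  # ['entropy', 'mean', 'std']
```

Feature engineering then selects the stable, reproducible subset of such features, and statistical modeling maps them to clinical endpoints, the two stages the review identifies as the main open challenges.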
Fund: Funded by the Fundamental Research Project of the CNPC Geophysical Key Lab (2022DQ0604-4) and the Strategic Cooperation Technology Projects of China National Petroleum Corporation and China University of Petroleum-Beijing (ZLZX 202003).
文摘Abstract: With the successful application of deep learning to image segmentation, the field of seismic facies interpretation using convolutional neural networks has developed continuously. These intelligent, automated methods significantly reduce manual labor, particularly the laborious task of manually labeling seismic facies. However, their extensive demand for training data limits wider application. To overcome this challenge, we adopt the UNet architecture as the foundational network structure for seismic facies classification, which has demonstrated effective segmentation even with small training samples. Additionally, we integrate spatial pyramid pooling and dilated convolution modules into the network architecture to enhance the perception of spatial information across a broader range. A seismic facies classification test on the public F3-block data verifies the superior performance of the improved network structure in delineating seismic facies boundaries. Comparative analysis against the traditional UNet model reveals that our method achieves more accurate classification results, as evidenced by various image segmentation evaluation metrics, with the classification accuracy reaching 96%. Furthermore, the seismic facies classification results in the seismic slice dimension further confirm the superior performance of the proposed method, which accurately delineates the extent of different seismic facies. This approach holds significant potential for analyzing geological patterns and extracting valuable depositional information.
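The dilated convolution module mentioned above widens a layer's receptive field without adding weights: a 3×3 kernel with dilation 2 samples a 5×5 window. A pure-NumPy illustration on a toy single-channel input (frameworks implement this inside their convolution layers; the input and kernel here are placeholders):

```python
import numpy as np

def dilated_conv2d(img, kernel, dilation=2):
    """Valid-mode 2-D convolution with a dilated kernel (single channel)."""
    kh, kw = kernel.shape
    eff_h = (kh - 1) * dilation + 1    # effective receptive-field height
    eff_w = (kw - 1) * dilation + 1
    H, W = img.shape
    out = np.zeros((H - eff_h + 1, W - eff_w + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            # Sample the window with stride = dilation, then correlate.
            patch = img[i:i + eff_h:dilation, j:j + eff_w:dilation]
            out[i, j] = np.sum(patch * kernel)
    return out

img = np.arange(64, dtype=float).reshape(8, 8)  # toy "seismic section"
kernel = np.ones((3, 3)) / 9.0                  # averaging kernel
out = dilated_conv2d(img, kernel, dilation=2)
print(out.shape)  # (4, 4): the 3x3 kernel with dilation 2 covers a 5x5 window
```

Stacking such modules (and spatial pyramid pooling) lets the network aggregate context at several scales, which is what helps it delineate facies boundaries from limited training labels.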