A high-precision regional gravity field model is significant in various geodesy applications. In the field of regional gravity field modelling, the spherical radial basis functions (SRBFs) approach has recently gained widespread attention, while the modelling precision is primarily influenced by the base function network. In this study, we propose a method for constructing a data-adaptive network of SRBFs using a modified Hierarchical Density-Based Spatial Clustering of Applications with Noise (HDBSCAN) algorithm, and the performance of the algorithm is verified against observed gravity data in the Auvergne area. Furthermore, the turning point method is used to optimize the bandwidth of the basis function spectrum, which satisfies the demand for both high-precision gravity field and quasi-geoid modelling simultaneously. Numerical experimental results indicate that our algorithm achieves an accuracy of about 1.58 mGal in constructing the gravity field model and about 0.03 m in the regional quasi-geoid model. Compared to existing methods, the number of SRBFs used for modelling is reduced by 15.8%, and the time needed to determine the centre positions of the SRBFs is reduced by 12.5%. Hence, the modified HDBSCAN algorithm presented here is a suitable design method for constructing the SRBF data-adaptive network.
Funding: Supported by MedTechLabs, GE HealthCare, the Swedish Research Council (No. 2021-05103), and the Göran Gustafsson Foundation (No. 2114).
Abstract: Deep learning (DL) has proven to be important for computed tomography (CT) image denoising. However, such models are usually trained under supervision, requiring paired data that may be difficult to obtain in practice. Diffusion models offer an unsupervised means of solving a wide range of inverse problems via posterior sampling. In particular, using the estimated unconditional score function of the prior distribution, obtained via unsupervised learning, one can sample from the desired posterior via hijacking and regularization. However, due to the iterative solvers used, the number of function evaluations (NFE) required may be orders of magnitude larger than for single-step samplers. In this paper, we present a novel image denoising technique for photon-counting CT by extending the unsupervised approach to inverse problem solving to the case of Poisson flow generative models (PFGM++). By hijacking and regularizing the sampling process we obtain a single-step sampler, that is, NFE = 1. Our proposed method incorporates posterior sampling using diffusion models as a special case. We demonstrate that the added robustness afforded by the PFGM++ framework yields significant performance gains. Our results indicate competitive performance, on clinical low-dose CT data and clinical images from a prototype photon-counting CT system developed by GE HealthCare, compared to popular supervised methods (including state-of-the-art diffusion-style models with NFE = 1, i.e. consistency models), unsupervised methods, and non-DL-based image denoising techniques.
Abstract: This paper discusses the estimation of parameters in the zero-inflated Poisson (ZIP) model by the method of moments. The method of moments estimators (MMEs) are analytically compared with the maximum likelihood estimators (MLEs). The results of a modest simulation study are presented.
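As a concrete sketch of the moment approach (not taken from the paper; the parameter values and variable names are illustrative assumptions), the ZIP moment equations E[X] = (1 - pi) * lam and E[X^2] = (1 - pi) * lam * (1 + lam) give lam_hat = m2/m1 - 1 and pi_hat = 1 - m1/lam_hat:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate ZIP data: with probability pi the count is a structural zero,
# otherwise it is Poisson(lam).  Parameter values are illustrative.
pi_true, lam_true, n = 0.3, 2.5, 100_000
structural_zero = rng.random(n) < pi_true
x = np.where(structural_zero, 0, rng.poisson(lam_true, n))

# Moment equations: E[X] = (1-pi)*lam, E[X^2] = (1-pi)*lam*(1+lam),
# so m2/m1 = 1 + lam, giving the MMEs below.
m1, m2 = x.mean(), (x ** 2).mean()
lam_mme = m2 / m1 - 1
pi_mme = 1 - m1 / lam_mme
print(lam_mme, pi_mme)
```

With a large sample the MMEs land close to the true (lam, pi); the paper's analytical comparison with the MLEs concerns exactly how much efficiency this simplicity costs.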
Abstract: The zero-inflated Poisson model has found a wide variety of applications in recent years in statistical analyses of count data, especially in count regression models. The zero-inflated Poisson model is characterized in this paper through a linear differential equation satisfied by its probability generating function [1] [2].
Abstract: Introduction: Studies have shown that Emergency Department (ED) crowding contributes to reduced quality of patient care, delays in starting treatments, and an increased number of patients leaving without being seen. This analysis shows how to theoretically and optimally align staffing to demand. Methods: The ED value stream was identified and mapped. Patients were stratified into three resource-driven care flow cells based on severity indices. Time observations were conducted for each of the key care team members, and the manual cycle times and service rates were calculated and stratified by severity index. Using X32 Healthcare's Online Staffing Optimization (OSO) tool, staffing inefficiencies were identified and an optimal schedule was created for each provider group. Results: Lower severity indices (higher-acuity patients) led to longer times for providers, nurses, patient care assistants, and clerks. The patient length of stay varied from under one hour to over five hours. The flow of patients varied considerably over the 24-hour period but was similar by day of the week. Using flow data, we showed that we needed more nurses and more care team members during peak times of patient flow. Eight-hour shifts would allow better flexibility. We showed that the additional salary hours added to the budget would be offset by the increased revenue from decreasing the number of patients who leave without being seen. Conclusion: If implemented, these changes will improve ED flow by using lean tools and principles, ultimately leading to timelier care, reduced waits, and an improved patient experience.
Funding: Supported in part by the National Natural Science Foundation of China, the Guangdong Natural Science Foundation (S2011010004511), and the Fundamental Research Funds for the Central Universities of China (201120102020005).
Abstract: In this paper, we consider a compound Poisson risk model with taxes paid according to a loss-carry-forward system and dividends paid under a threshold strategy. First, the closed-form expression of the probability function for the total number of taxation periods over the lifetime of the surplus process is derived. Second, an analytical expression for the expected accumulated discounted dividends paid between two consecutive taxation periods is provided. In addition, explicit expressions are given for exponential individual claims.
Abstract: In this note we study the optimal dividend problem for a company whose surplus process, in the absence of dividend payments, evolves as a generalized compound Poisson model in which the counting process is a generalized Poisson process. This model includes the classical risk model and the Pólya-Aeppli risk model as special cases. The objective is to find a dividend policy that maximizes the expected discounted value of dividends paid to the shareholders until the company is ruined. We show that under some conditions the optimal dividend strategy is a barrier strategy. Moreover, two conjectures are proposed.
Funding: Supported by the Innovation Fund Research Project of the State Key Laboratory for Geomechanics and Deep Underground Engineering, China University of Mining and Technology (Grant No. SKLGDUEK202201), the Foundation for the Opening of the State Key Laboratory for Geomechanics and Deep Underground Engineering, China University of Mining and Technology (Grant No. SKLGDUEK2129), and the Open Research Fund of the State Key Laboratory of Geomechanics and Geotechnical Engineering, Institute of Rock and Soil Mechanics, Chinese Academy of Sciences (Grant No. Z020007).
Abstract: In recent years, there has been a trend in urban tunnel construction toward super-large-span tunnels for traffic diversion and route optimization purposes. However, the increased size makes tunnel support more difficult. Unfortunately, there are few studies on the failure and support mechanism of the surrounding rocks during excavation of a supported tunnel; most model tests of super-large-span tunnels focus on the failure characteristics of surrounding rocks during tunnel excavation without supports. Based on the excavation compensation method (ECM), model tests of a super-large-span tunnel excavated with different anchor cable support methods in the initial support stage were carried out. The results indicate that during excavation of a super-large-span tunnel, the stress and displacement of the shallow surrounding rocks decrease in a step-shaped pattern, and tunnel failure is mainly concentrated in the vault and spandrel areas. Compared with conventional anchor cable supports, the NPR (negative Poisson's ratio) anchor cable support is more suitable for the initial support stage of super-large-span tunnels. The tunnel support theory, model test materials, methods, and results obtained in this study can serve as references for the study of similar super-large-span tunnels.
Funding: National Outstanding Youth Science Fund Project, China (No. 71401173).
Abstract: Degradation process modeling is one of the research hotspots of prognostics and health management (PHM), and can be used to estimate system reliability and remaining useful life (RUL). To study the system degradation process, a cumulative damage model is used for degradation modeling. Assuming that each damage increment is Gamma distributed, the shock count follows a homogeneous Poisson process (HPP) when the degradation process is linear, and a non-homogeneous Poisson process (NHPP) when the degradation process is nonlinear. A two-stage degradation system is considered in this paper, in which the degradation process is linear in the first stage and nonlinear in the second stage. A nonlinear modeling method for the considered system is put forward, and a reliability model and a remaining useful life model are established. A case study is given to validate the established models.
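A minimal Monte Carlo sketch of such a two-stage shock model (all numeric values and names below are illustrative assumptions, not the paper's parameters): Gamma-distributed damage increments arrive via an HPP at a constant rate in the first stage and via an NHPP with a linearly increasing rate in the second, and failure occurs when cumulative damage first crosses a threshold.

```python
import numpy as np

rng = np.random.default_rng(1)

def failure_time(threshold=50.0, t_switch=10.0, lam=1.0, a=0.5,
                 shape=2.0, scale=1.0, t_max=100.0, dt=0.01):
    """One cumulative-damage path: Gamma shock increments, HPP arrivals
    (constant rate lam) before t_switch, NHPP arrivals (rate a*t) after."""
    t, damage = 0.0, 0.0
    while t < t_max:
        t += dt
        rate = lam if t < t_switch else a * t
        if rng.random() < rate * dt:            # a shock occurs in (t-dt, t]
            damage += rng.gamma(shape, scale)   # Gamma-distributed increment
            if damage >= threshold:
                return t                        # first passage = failure
    return t_max

times = [failure_time() for _ in range(500)]
print(np.mean(times))                            # Monte Carlo mean life
```

Averaging many such first-passage times gives a crude RUL estimate; the paper instead derives the reliability and RUL models analytically for this two-stage structure.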
Abstract: In this paper, a hybrid dividend strategy in the compound Poisson risk model is considered. In the absence of dividends, the surplus of an insurance company is modelled by a compound Poisson process. Dividends are paid at a constant rate whenever the modified surplus is in an interval; the premium income no longer goes into the surplus but is paid out as dividends whenever the modified surplus exceeds the upper bound of the interval; otherwise no dividends are paid. Integro-differential equations with boundary conditions satisfied by the expected total discounted dividends until ruin are derived; for example, closed-form solutions are given when claims are exponentially distributed. Accordingly, the moments and moment-generating functions of the total discounted dividends until ruin are considered. Finally, the Gerber-Shiu function and the Laplace transform of the ruin time are discussed.
Abstract: A compound Poisson risk model has been simulated, starting with exponential claim sizes. The simulations check for infinite-horizon ruin. An appropriate time window has been chosen to estimate and compare ruin probabilities. The infinite-horizon ruin probabilities of two compound Poisson risk processes have been estimated and compared with standard theoretical results.
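As a hedged sketch of such a simulation (the parameters and the finite time window are illustrative choices, not the paper's), the classical model with exponential claims of mean mu, premium rate c and claim rate lam admits the closed form psi(u) = (lam*mu/c) * exp(-(1/mu - lam/c) * u), which a crude Monte Carlo estimate can be checked against:

```python
import numpy as np

rng = np.random.default_rng(2)

# Classical compound Poisson surplus U(t) = u + c*t - S(t) with
# exponential claims of mean mu; parameter values are illustrative.
u, c, lam, mu = 10.0, 1.2, 1.0, 1.0
horizon, n_paths = 200.0, 5_000   # finite window approximating "infinite"

def ruined():
    t, surplus = 0.0, u
    while t < horizon:
        wait = rng.exponential(1 / lam)          # time to next claim
        t += wait
        surplus += c * wait - rng.exponential(mu)  # premiums earned minus claim
        if surplus < 0:
            return True
    return False

psi_sim = np.mean([ruined() for _ in range(n_paths)])
# Known closed form for exponential claims:
psi_theory = (lam * mu / c) * np.exp(-(1 / mu - lam / c) * u)
print(psi_sim, psi_theory)
```

Because the surplus drifts upward when c > lam*mu, ruin after the chosen window is very unlikely, so the finite window is a reasonable stand-in for the infinite horizon, which is the same time-window argument the abstract alludes to.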
Abstract: A new covariate-dependent zero-truncated bivariate Poisson model is proposed in this paper, employing a generalized linear model. A marginal-conditional approach is used to construct the bivariate model. The proposed model, with its estimation procedure and tests for goodness-of-fit and under- (or over-) dispersion, is presented and applied to road safety data. The two correlated outcome variables considered in this study are the number of cars involved in an accident and the number of casualties for a given number of cars.
Abstract: Pasteuria penetrans controls root-knot nematodes (Meloidogyne spp.) either by preventing invasion or by causing female sterility. The greatest control effect of P. penetrans occurs when a sufficient quantity of P. penetrans spores attaches to the nematode cuticle. The number of spores attaching to J2s within a given time increased with the time of attachment. Based on that, we produced in vitro attachment data, recording encumbered nematodes 1, 3, 6 and 9 h after placing nematodes in a standard P. penetrans spore suspension. From the count data obtained we modeled P. penetrans attachment using the Poisson and the negative binomial distribution. The attachment count data were observed to be overdispersed, with high numbers of spores attached to each J2 at 6 and 9 h after spore application. We concluded that the negative binomial distribution is the most appropriate model to fit the observed data sets, considering that P. penetrans spores are clumped.
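A small sketch of why a negative binomial fits clumped attachment counts better than a Poisson (the simulated data and parameter values are illustrative assumptions, not the paper's observations): a Gamma-mixed Poisson is exactly negative binomial, its variance exceeds its mean by m^2/k, and a moment estimate recovers the dispersion parameter k:

```python
import numpy as np

rng = np.random.default_rng(3)

# Clumped counts via a Gamma-mixed Poisson, which is exactly negative
# binomial with mean m and variance m + m^2/k.  Values are illustrative.
mean, k = 8.0, 2.0
counts = rng.poisson(rng.gamma(k, mean / k, size=5_000))

m, v = counts.mean(), counts.var()
# A Poisson fit would force v == m; the excess v - m identifies k.
k_hat = m ** 2 / (v - m)
print(m, v, k_hat)
```

The strong excess of the sample variance over the sample mean is the overdispersion signature the abstract describes, and it is exactly what rules out the plain Poisson model for clumped spores.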
Funding: This project was sponsored by the National Science Foundation of China (19702060).
Abstract: This paper discusses the random distribution of the loading and unloading response ratio (LURR) under different definitions (Y1~Y5), using the assumptions that earthquakes occur following a Poisson process and their magnitudes obey the Gutenberg-Richter law. The results show that Y1~Y5 are quite stable, or concentrated, when the expected number of events in the calculation time window is relatively large (>40); but when this occurrence rate becomes very small, Y2~Y5 become quite variable, or unstable. That is to say, a high value of the LURR can be produced not only by seismicity before a large earthquake, but also by a random sequence of earthquakes obeying a Poisson process when the expected number of events in the window is too small. To check the influence of randomness in the catalogue on the LURR, the random distribution of the LURR under Poisson models has been calculated by simulation. The 90%, 95% and 99% confidence ranges of Y1 and Y3 are given in this paper, which is helpful for quantifying this randomness.
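The instability of a count ratio at small expected event numbers can be sketched with a toy simulation (a deliberate simplification: here Y is just the ratio of two independent Poisson counts, not the paper's energy-based Y1~Y5 definitions):

```python
import numpy as np

rng = np.random.default_rng(4)

def lurr_samples(expected_events, n_sim=20_000):
    """Toy LURR: ratio of independent Poisson counts in the loading and
    unloading half-cycles, each carrying half the expected events."""
    n_load = rng.poisson(expected_events / 2, n_sim)
    n_unload = rng.poisson(expected_events / 2, n_sim)
    ok = n_unload > 0                 # ratio undefined for zero counts
    return n_load[ok] / n_unload[ok]

# 95% quantile of the ratio under pure chance, for growing window sizes
for n_events in (10, 40, 160):
    y = lurr_samples(n_events)
    print(n_events, round(float(np.quantile(y, 0.95)), 2))
```

The upper quantile shrinks toward 1 as the expected number of events grows, which mirrors the paper's point: small windows produce spuriously high ratios by chance alone, so confidence ranges from such a simulation are needed to interpret a high LURR.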
Abstract: Stop frequency models, as one element of activity-based models, represent an important part of travel behavior. Unobserved heterogeneity across travelers should be taken into consideration to prevent bias and inconsistency in the estimated parameters of stop frequency models. Additionally, previous studies on stop frequency have mostly been done in larger metropolitan areas, and less attention has been paid to areas with smaller populations. This study addresses these gaps by using 2012 travel data from a medium-sized U.S. urban area, with the work tour as the case study. Stops in the work tour were classified into three groups: the outbound leg, the work-based subtour, and the inbound leg of the commute. Latent class Poisson regression models were used to analyze the data. The results indicate the presence of heterogeneity across commuters. Using latent class models significantly improves the predictive power compared to regular one-class Poisson regression models. In contrast to one-class Poisson models, gender becomes insignificant in predicting the number of tours when unobserved heterogeneity is accounted for. Commuters make more stops on their work-based subtour when the employment density of service-related occupations increases in their work zone, but the employment density of retail employment does not significantly contribute to the stop-making likelihood of commuters. Additionally, an increase in the number of work tours was associated with fewer stops on the inbound leg of the commute. The results of this study suggest accounting for unobserved heterogeneity in stop frequency models and help transportation agencies and policy makers make better inferences from such models.
Abstract: In this work, some non-homogeneous Poisson models are considered to study the behaviour of ozone in the city of Puebla, Mexico. Several functions are used as the rate function of the non-homogeneous Poisson process. In addition to their dependence on time, these rate functions also depend on parameters that need to be estimated, and a Bayesian approach is taken to estimate them. Since the expressions for the distributions of the parameters involved in the models are very complex, Markov chain Monte Carlo algorithms are used. The methodology is applied to ozone data from the city of Puebla, Mexico.
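A minimal sketch of the Bayesian route this abstract describes, assuming a Weibull (power-law) rate λ(t) = (β/σ)(t/σ)^(β−1), flat priors on (log β, log σ), and a plain random-walk Metropolis sampler standing in for the paper's more elaborate MCMC schemes. The data are synthetic, not the Puebla ozone series, and every name and tuning constant is an illustrative assumption.

```python
import math
import random

def simulate_weibull_nhpp(beta, sigma, t_end, rng):
    # Inversion: unit-rate homogeneous arrivals s are mapped through the
    # inverse of Lambda(t) = (t / sigma)**beta, i.e. t = sigma * s**(1/beta).
    big_lambda = (t_end / sigma) ** beta
    s, times = 0.0, []
    while True:
        s += rng.expovariate(1.0)
        if s > big_lambda:
            return times
        times.append(sigma * s ** (1.0 / beta))

def log_lik(times, t_end, beta, sigma):
    # NHPP log-likelihood: sum of log-rates minus the integrated rate.
    if beta <= 0.0 or sigma <= 0.0:
        return -math.inf
    ll = -((t_end / sigma) ** beta)
    for t in times:
        ll += math.log(beta / sigma) + (beta - 1.0) * math.log(t / sigma)
    return ll

def posterior_beta_mean(times, t_end, n_steps=8000, burn=2000, seed=2):
    # Random-walk Metropolis on (log beta, log sigma); with a flat prior on
    # the log-parameters the log-normal proposal is symmetric, so the
    # acceptance ratio reduces to the likelihood ratio.
    rng = random.Random(seed)
    beta, sigma = 1.0, t_end / 2.0
    ll = log_lik(times, t_end, beta, sigma)
    kept = []
    for step in range(n_steps):
        b_new = beta * math.exp(0.1 * rng.gauss(0.0, 1.0))
        s_new = sigma * math.exp(0.1 * rng.gauss(0.0, 1.0))
        ll_new = log_lik(times, t_end, b_new, s_new)
        if math.log(rng.random()) < ll_new - ll:
            beta, sigma, ll = b_new, s_new, ll_new
        if step >= burn:
            kept.append(beta)
    return sum(kept) / len(kept)

rng = random.Random(7)
times = simulate_weibull_nhpp(1.5, 5.0, 100.0, rng)
print(len(times), posterior_beta_mean(times, 100.0))
```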
Funding: Financially supported by project PAPIIT number IN104110-3 of the Direccion General de Apoyo al Personal Academico of the Universidad Nacional Autonoma de Mexico, Mexico; this work is part of JMB's Ph.D., partially funded by the Consejo Nacional de Ciencias y Tecnologia, Mexico, through Ph.D. Scholarship number 210347. JAA was partially funded by the Conselho Nacional de Pesquisa, Brazil, grant number 300235/2005-4.
Abstract: We consider some non-homogeneous Poisson models to estimate the mean number of times that a given environmental threshold of interest is surpassed by a given pollutant. Seven different rate functions for the Poisson processes describing the models are taken into account: the Weibull, the exponentiated-Weibull, and their generalisation, the Beta-Weibull rate function, as well as the Musa-Okumoto, the Goel-Okumoto, a generalised Goel-Okumoto, and the Weibull-geometric rate functions. Whenever it is thought justifiable, a model allowing the presence of change-points is also considered. The different models are applied to the daily maximum ozone measurements provided by the monitoring network of the Metropolitan Area of Mexico City, with the aim of comparing how well the different rate functions fit the data. Although some of these rate functions have been considered before, they were applied to different data sets, so a comparison of the adequacy of the models was not possible; here we apply them all to the same data set. The measurements considered were obtained after a series of environmental measures were implemented in Mexico City, and hence the data behave differently from those of earlier studies.
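One of the listed rate functions admits a particularly compact illustration of "comparing the adjustment of different rate functions to the same data": the power-law (Weibull) process has closed-form maximum-likelihood estimators, so one can fit it alongside a homogeneous Poisson baseline and compare the two by AIC. This is a simplified frequentist stand-in for the paper's Bayesian comparison, on synthetic exceedance times; all names and values are illustrative.

```python
import math
import random

def simulate_power_law_process(beta, sigma, t_end, rng):
    # Event times of an NHPP with cumulative rate Lambda(t) = (t/sigma)**beta.
    big_lambda = (t_end / sigma) ** beta
    s, times = 0.0, []
    while True:
        s += rng.expovariate(1.0)
        if s > big_lambda:
            return times
        times.append(sigma * s ** (1.0 / beta))

def weibull_mle(times, t_end):
    # Closed-form MLEs for the power-law (Weibull) rate function:
    #   beta_hat = n / sum(ln(T / t_i)),  sigma_hat = T / n**(1 / beta_hat)
    n = len(times)
    beta_hat = n / sum(math.log(t_end / t) for t in times)
    sigma_hat = t_end / n ** (1.0 / beta_hat)
    return beta_hat, sigma_hat

def aic_weibull(times, t_end):
    b, s = weibull_mle(times, t_end)
    ll = -((t_end / s) ** b) + sum(
        math.log(b / s) + (b - 1.0) * math.log(t / s) for t in times)
    return 2 * 2 - 2.0 * ll  # two fitted parameters

def aic_homogeneous(times, t_end):
    # Baseline: constant rate lambda = n / T.
    n = len(times)
    lam = n / t_end
    ll = n * math.log(lam) - lam * t_end
    return 2 * 1 - 2.0 * ll  # one fitted parameter

rng = random.Random(3)
times = simulate_power_law_process(2.0, 10.0, 100.0, rng)
print(weibull_mle(times, 100.0))
print(aic_weibull(times, 100.0), aic_homogeneous(times, 100.0))
```

Because the simulated exceedances become denser over time (β = 2), the Weibull rate fits markedly better than the constant-rate baseline despite its extra parameter.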
Funding: Funded by the Fundamental Research Funds for the Chinese Academy of Surveying and Mapping (AR2402) and the Open Fund of the Wuhan Gravitation and Solid Earth Tides National Observation and Research Station (No. WHYWZ202213).
Abstract: A high-precision regional gravity field model is significant in various geodesy applications. In the field of regional gravity field modelling, the spherical radial basis functions (SRBFs) approach has recently gained widespread attention, with modelling precision primarily influenced by the network of base functions. In this study, we propose a method for constructing a data-adaptive network of SRBFs using a modified Hierarchical Density-Based Spatial Clustering of Applications with Noise (HDBSCAN) algorithm, and verify its performance with observed gravity data in the Auvergne area. Furthermore, the turning point method is used to optimize the bandwidth of the basis function spectrum, which satisfies the demands of both high-precision gravity field and quasi-geoid modelling simultaneously. Numerical experimental results indicate that our algorithm achieves an accuracy of about 1.58 mGal in the gravity field model and about 0.03 m in the regional quasi-geoid model. Compared to existing methods, the number of SRBFs used for modelling is reduced by 15.8%, and the time needed to determine the centre positions of the SRBFs is reduced by 12.5%. Hence, the modified HDBSCAN algorithm presented here is a suitable design method for constructing an SRBF data-adaptive network.
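The centre-selection step can be illustrated with a density-based clustering over observation coordinates. The paper uses a modified HDBSCAN; since that requires the `hdbscan` package or a recent scikit-learn, the sketch below substitutes a minimal plain DBSCAN (a simpler member of the same family) on synthetic 2-D points, taking cluster centroids as candidate SRBF centre positions. The point layout, `eps`, and `min_pts` are illustrative assumptions, not the paper's settings.

```python
import math
import random

def dbscan(points, eps, min_pts):
    # Minimal DBSCAN; labels[i] is a cluster id (0, 1, ...) or -1 for noise.
    labels = [None] * len(points)

    def neighbours(i):
        return [j for j, q in enumerate(points) if math.dist(points[i], q) <= eps]

    cluster = -1
    for i in range(len(points)):
        if labels[i] is not None:
            continue
        seeds = neighbours(i)
        if len(seeds) < min_pts:
            labels[i] = -1           # tentative noise (may become a border point)
            continue
        cluster += 1
        labels[i] = cluster
        queue = list(seeds)
        while queue:
            j = queue.pop()
            if labels[j] == -1:
                labels[j] = cluster  # noise reached from a core point: border
                continue
            if labels[j] is not None:
                continue
            labels[j] = cluster
            nb = neighbours(j)
            if len(nb) >= min_pts:   # j is itself a core point: keep expanding
                queue.extend(nb)
    return labels

def srbf_centres(points, eps=1.0, min_pts=4):
    # Cluster the observation sites; use cluster centroids as candidate
    # SRBF centre positions (noise points are discarded).
    groups = {}
    for lab, p in zip(dbscan(points, eps, min_pts), points):
        if lab >= 0:
            groups.setdefault(lab, []).append(p)
    return [(sum(x for x, _ in g) / len(g), sum(y for _, y in g) / len(g))
            for g in groups.values()]

# Synthetic observation sites: three dense patches of gravity measurements.
rng = random.Random(4)
points = [(cx + rng.gauss(0, 0.4), cy + rng.gauss(0, 0.4))
          for cx, cy in [(0.0, 0.0), (6.0, 0.0), (3.0, 5.0)]
          for _ in range(30)]
centres = srbf_centres(points)
print(len(centres), centres)
```

The data-adaptive aspect is that the number and placement of centres follow the observation density, rather than a fixed regular grid; HDBSCAN additionally removes the need to choose `eps` by hand.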