Using Bayesian estimation, second-moment comparison, impulse response analysis, variance decomposition, historical decomposition and counterfactual simulation, this paper reaches the following conclusions for an economy with financial frictions and financial shocks: 1) The welfare loss under the interest rate (price) rule is greater than under the quantity (money supply) rule. 2) The quantity rule copes better than the interest rate rule with negative financial shocks (financial crises). 3) Financial shocks play the primary role in explaining the volatility of output, investment, debt and employment under both the interest rate rule and the quantity rule, accounting for 40.54% and 38.95% of the variance of China's output, respectively. 4) The choice of monetary policy rule has a great influence on China's economy, and its effect on inflation is greater than its effect on output. 5) Given that interest rates are not fully market-oriented and financial shocks exist, the People's Bank of China should rely more on the quantity rule to regulate the economy.
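As a point of reference for the two rules compared above, a hedged sketch of their standard log-linearized forms follows; the smoothing parameters and response coefficients are generic textbook notation, not the paper's estimated specification.

```latex
% Interest rate (price) rule with smoothing -- illustrative standard form:
\hat{R}_t = \rho_r \hat{R}_{t-1}
          + (1-\rho_r)\bigl(\phi_\pi \hat{\pi}_t + \phi_y \hat{y}_t\bigr)
          + \varepsilon_{r,t}
% Money growth (quantity) rule -- money growth leans against inflation
% and the output gap (sign convention assumed):
\hat{\mu}_t = \rho_m \hat{\mu}_{t-1}
            - \phi_\pi \hat{\pi}_t - \phi_y \hat{y}_t + \varepsilon_{m,t}
```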
Based on the LIBOR market model (LMM) with regime switching and stochastic volatility, theoretical methods for pricing CMSSO are analyzed and explored by means of Fourier analysis and the Feynman-Kac theorem. First, according to the features and value composition of CMSSO, we put forward a theoretical computational framework for pricing this product. Second, based on the random and regime-switching jump features of LIBOR and swap rates, we set up an LMM/SMM with stochastic volatility and regime switching, and calibrate and estimate the model parameters using Black-backstepping and adaptive MCMC. Last, on the basis of the SRSV-LMM and SRSV-SMM, theoretical pricing formulas are derived, and their empirical computation and comparative analysis are carried out via Fourier analysis and the Feynman-Kac theorem. The conclusions are as follows: according to the Monte Carlo simulation of forward LIBOR and swap rate paths, the SRSV-LMM and SRSV-SMM have smaller simulation errors and better simulation performance. Further, compared with the Monte Carlo method, our approach gives better empirical results in terms of computation time and simulation accuracy.
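For concreteness, one representative stochastic-volatility LMM with regime switching is sketched below (Wu-Zhang-type dynamics with parameters modulated by a finite-state Markov chain s_t); the notation is assumed rather than taken from the paper. The Feynman-Kac theorem links the option price to a parabolic PDE, whose associated characteristic function is then inverted by Fourier analysis to obtain pricing formulas.

```latex
% Forward LIBOR L_k under stochastic volatility V and regime s_t (assumed form):
dL_k(t) = \sqrt{V(t)}\,\sigma_k(t; s_t)\,L_k(t)\,dW_k(t)
% Square-root variance process with regime-dependent parameters:
dV(t) = \kappa(s_t)\bigl(\theta(s_t) - V(t)\bigr)\,dt
      + \epsilon(s_t)\sqrt{V(t)}\,dZ(t)
```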
Given that stock fluctuations have significant multi-scale features, a novel Value-at-Risk (VaR) model, the BEMD-Copula-GARCH model, is proposed by combining bivariate empirical mode decomposition (BEMD) with the Copula-GARCH algorithm. The proposed model involves three main steps: data decomposition, individual risk measurement, and total risk integration. First, the bivariate EMD technique is employed to decompose the pair of complex, interactive stock series into pairs of relatively simple and independent components, reducing the modeling difficulty. Second, the Copula-GARCH model is introduced to capture the dynamic dependence within each pair of decomposed components, measuring VaR at different time scales. Finally, the individual results are integrated into the final VaR measure. In the empirical study, an equal-weighted portfolio of the Hang Seng Index and the Shanghai Composite Index is analyzed, and the results indicate that the proposed model outperforms the benchmark DCC-GARCH and Copula-GARCH models in terms of risk measurement accuracy.
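The pipeline can be sketched in a few lines. The sketch below substitutes univariate EMD per series for the paper's bivariate EMD, a GARCH(1,1) filter per component, a Gaussian copula for the dependence structure, and simple summation of component VaRs for the integration step; all of these substitutions, the equal weights, and the `PyEMD`/`arch` packages are assumptions for illustration only.

```python
# Minimal sketch of the decompose -> Copula-GARCH -> integrate pipeline.
import numpy as np
from PyEMD import EMD                  # pip install EMD-signal
from arch import arch_model           # pip install arch
from scipy.stats import norm

def normal_scores(z):
    ranks = np.argsort(np.argsort(z)) + 1          # ranks 1..n
    return norm.ppf(ranks / (len(z) + 1.0))        # rank-based normal scores

def pair_var(x, y, alpha=0.05, n_sims=20000, seed=0):
    """One-step-ahead VaR of an equal-weighted pair of matched components."""
    rng = np.random.default_rng(seed)
    vols, resid = [], []
    for r in (x, y):
        fit = arch_model(100 * r, vol="Garch", p=1, q=1).fit(disp="off")
        resid.append(np.asarray(fit.std_resid))
        vols.append(float(np.sqrt(fit.forecast(horizon=1).variance.values[-1, 0])))
    # Gaussian copula fitted to rank-based pseudo-observations
    rho = np.corrcoef(normal_scores(resid[0]), normal_scores(resid[1]))[0, 1]
    z = rng.multivariate_normal([0, 0], [[1, rho], [rho, 1]], size=n_sims)
    port = 0.5 * z[:, 0] * vols[0] + 0.5 * z[:, 1] * vols[1]
    return -np.percentile(port, 100 * alpha) / 100  # back to return units

# Decompose each series, pair the components by scale, sum the component VaRs:
# imfs_x, imfs_y = EMD().emd(ret_x), EMD().emd(ret_y)
# total_var = sum(pair_var(ix, iy) for ix, iy in zip(imfs_x, imfs_y))
```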
Finance theory suggests that capital prices change in response to unexpected major events. Many papers have discussed whether there is a significant relationship between stock prices and media reports, but no consensus has been reached. In this paper, we take the liquor plasticizer incident as an example to analyze the influence of the volume of exposure news on the stock prices of listed companies. The study shows that plasticizer exposure news affected liquor-sector stock prices over an extended period, but the impact varied across individual stocks. In addition, the study finds that the central government's anti-corruption campaign suppressed liquor-sector stock prices.
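The standard tool behind findings of this kind is a market-model event study; the sketch below shows the usual cumulative abnormal return (CAR) computation, with the estimation window, event window and data layout assumed for illustration rather than taken from the paper.

```python
# Market-model event study sketch: fit alpha/beta on an estimation window,
# then cumulate abnormal returns over the event window.
import numpy as np
import statsmodels.api as sm

def car(stock_ret, market_ret, est_win, evt_win):
    """stock_ret/market_ret: 1-D return arrays; est_win/evt_win: index slices."""
    X = sm.add_constant(market_ret[est_win])
    fit = sm.OLS(stock_ret[est_win], X).fit()
    expected = fit.params[0] + fit.params[1] * market_ret[evt_win]
    return float(np.sum(stock_ret[evt_win] - expected))

# e.g. car(r_stock, r_market, slice(0, 200), slice(200, 220))
```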
As a special kind of public good, a threshold public good has a unique feature beyond the non-excludability and non-rivalry shared with other public goods: it is provided only if the aggregate supply exceeds a critical level, called the "threshold", and is not provided otherwise. This paper sets up a model to theoretically analyze the effect of the order of moves (simultaneous vs. sequential) on the provision of threshold public goods, comparing the simultaneous and sequential games in terms of social welfare, fairness and success rate. The conclusion is that the simultaneous game yields higher social welfare and fairness but a lower success rate than the sequential game.
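A toy two-player discrete version makes the trade-off concrete: under a full-refund rule (an assumption, as are the endowment, threshold and valuation below), the simultaneous game admits both cost-sharing equilibria and a coordination-failure equilibrium, while the sequential game's backward-induction outcome always provides the good but splits the cost unevenly.

```python
# Toy threshold public goods game: simultaneous Nash equilibria vs.
# sequential subgame-perfect outcome. All numbers are illustrative.
import itertools

E, T, V = 4, 4, 3.5      # endowment, threshold, per-player value of the good

def payoff(ci, cj):
    # contributions are refunded if the threshold is missed (assumed rule)
    return E - ci + V if ci + cj >= T else E

C = range(E + 1)

# Simultaneous play: pure-strategy Nash equilibria
nash = [(a, b) for a, b in itertools.product(C, C)
        if payoff(a, b) >= max(payoff(a2, b) for a2 in C)
        and payoff(b, a) >= max(payoff(b2, a) for b2 in C)]

# Sequential play: backward induction, player 1 moves first
def best_reply(a):
    return max(C, key=lambda b: payoff(b, a))

leader = max(C, key=lambda a: payoff(a, best_reply(a)))
print("simultaneous NE:", nash)       # [(0,0), (1,3), (2,2), (3,1)]: fair splits
                                      # coexist with coordination failure (0,0)
print("sequential SPE:", (leader, best_reply(leader)))   # (1, 3): leader
                                      # free-rides, but provision never fails
```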
This paper studies the ripple effects between first-tier cities' house prices and those of second- and third-tier cities against the background of people fleeing "Beijing, Shanghai, Guangzhou and Shenzhen" ("BSGS"), and analyzes the determinants of the ripple effect from three aspects: city size, distance from the first-tier city, and degree of the ripple effect. The results show that: 1) The degree of industrial similarity, the gap in life amenities, transport convenience, the efficiency of information flow between cities, the cultural gap, the employment growth rate, city openness, and total savings all have significant positive influences on the ripple effects under the background of fleeing "BSGS". 2) These factors influence second-tier cities more strongly than third-tier cities. 3) As the distance from the first-tier city increases, the influence of industrial similarity, transport convenience, the employment growth rate and city openness weakens, the influence of the life amenity gap strengthens, and distance shows no relationship with the influence of inter-city information flow efficiency or total savings. 4) As the quantile increases, the influence of most factors strengthens, showing an "accelerating effect".
Using Chinese provincial data from 1997 to 2012, this article builds a semi-parametric panel spatial vector autoregression model to empirically test impulse responses among foreign direct investment (FDI), innovation and economic growth in time and space. The estimation results show that: 1) The time- and space-lagged effects of FDI on economic growth are positive. The time- and space-lagged effects of innovation on FDI are positive, with a spatial lag effect of 0.8, while the time- and space-lagged effects of innovation on economic growth are negative. The time-lagged effects of economic growth on FDI, innovation and economic growth itself are all positive, whereas its spatial lag effects are all negative; moreover, the spatial lag effects of economic growth on FDI and innovation are much larger, at -0.86 and -0.43 respectively. 2) When Jiangsu Province is the source of an FDI or innovation shock, the responses of other provinces essentially converge to zero after six periods; provinces neighboring Jiangsu, such as Shandong and Zhejiang, respond more significantly, and the impact on other regions weakens as distance increases. When Guangdong Province is the source of an economic growth shock, the responses of other regions likewise converge to zero after six periods, with the impact again diminishing with distance.
To study the effect of entrepreneurs' political connections with local government on technological innovation performance, we choose R&D investment as the mediator variable and low-cost strategy as the moderator variable, seeking a deeper analysis of the specific mechanism between them. Using data from companies listed on the GEM board of the Shenzhen Stock Exchange and drawing on the theory of reciprocal returns on social capital, we conduct an empirical study with SPSS and find that entrepreneurs' political connections with local government are detrimental to technological innovation performance, that R&D investment mediates this relationship, and that a low-cost strategy moderates and weakens the influence of R&D investment on technological innovation performance. On the one hand, local governments should not focus solely on GDP growth and exert excessive pressure on enterprises, but should improve appraisal rules and provide a platform for exploration and innovation. On the other hand, once such connections are established, enterprises should make full use of the scarce resources and favorable conditions they bring to invest in innovative projects; this is both a good use of resources and a fundamental policy for the sustainable development of enterprises. Meanwhile, a low-cost strategy is not conducive to innovative output: in daily business activities, enterprises should not blindly pursue a low-cost, high-profit business model, but should take a long-run view and invest more in innovative products.
How to remove the artificially high component of drug prices under the centralized bidding and procurement scheme is a very difficult problem in China's medical reform. From the perspective of mechanism design, this paper examines in detail the drug pricing mechanism throughout the circulation process, from bidding to procurement, prescription, and final use. We then analyze the origin of high drug prices and the associated profit chain. Further, several popular medical reform measures are discussed in terms of their contribution to removing the artificially high price component. Finally, using mechanism design theory, a practical policy tool is provided to remove it completely.
To explore the effects of consumer preference on product pricing and targeted advertising, a mathematical model of asymmetric duopolists' profits under targeted advertising and price discrimination is established, and a related case study is performed. The results indicate that consumer preference directly affects firms' profits. More targeted advertising is sent to a firm's strong market for higher returns even when price discrimination is not allowed. When price discrimination is permitted, both firms send more ads in their strong markets and charge consumers there higher prices. Thus, the combination of price discrimination and targeted advertising can serve as an effective marketing tool. When a firm's targeting precision is imperfect, however, targeted advertising with price discrimination may intensify market competition.
This paper applies the online sequential prediction framework of aggregating expert advice to the multi-product multi-period newsvendor problem. Taking each fixed order quantity strategy as an expert, it uses the weak aggregating algorithm (WAA) to build online ordering strategies and provides theoretical guarantees for them based on the competitive theory of the WAA. First, the paper derives an online ordering strategy for the two-product multi-period newsvendor problem and proves that its cumulative gain is asymptotically as large as that of the best fixed order quantity strategy. The strategy and its theoretical guarantee are then generalized to the multi-product multi-period case. Finally, numerical examples under different demand types illustrate that the proposed online ordering strategies maintain strong competitive performance against the best fixed order quantity strategies.
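The aggregation step can be illustrated with generic exponential weights in the spirit of the WAA; the newsvendor gain function, the price and cost numbers, and the 1/sqrt(t) learning-rate schedule below are assumptions for illustration, not the paper's exact algorithm.

```python
# Exponential-weights aggregation over fixed order quantities (WAA-style).
import numpy as np

def newsvendor_gain(q, d, price=10.0, cost=6.0):
    return price * min(q, d) - cost * q            # revenue minus order cost

def waa_orders(demands, experts):
    """Return the aggregated order quantity for each period (single product)."""
    experts = np.asarray(experts, dtype=float)
    cum = np.zeros(len(experts))                   # cumulative expert gains
    orders = []
    for t, d in enumerate(demands, start=1):
        w = np.exp((cum - cum.max()) / np.sqrt(t)) # weights favor high gainers
        w /= w.sum()
        orders.append(float(w @ experts))          # weighted-average order
        cum += [newsvendor_gain(q, d) for q in experts]
    return orders

# waa_orders(demand_series, experts=np.arange(0, 101, 10))
```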
This paper analyzes the differences between the Prosper and PPDai platforms in terms of operation mode, risk control and fee model, and points out shortcomings in borrower thresholds, loan listings, fees and credit conditions. On the basis of Prosper's operating mechanism, it then puts forward a dynamic reverse auction mechanism (DRAM) to study the trading mechanism of online P2P lending in China. The paper first demonstrates the incentive compatibility of the DRAM and derives its Nash equilibrium. It further sets up two auction pricing strategies, one with a deterministic auction pricing time and one with an uncertain pricing time. Finally, through example analysis, the paper shows that the first pricing strategy helps improve lender returns and disperse investment risk, and can therefore be used to improve the current operation mode of online P2P lending in China.
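To fix ideas, here is a toy sketch of the Prosper-style reverse auction that the DRAM builds on: lender bids are filled from the lowest rate upward until the requested amount is covered, with the marginal rate taken as a uniform clearing rate. The uniform-price convention and the numbers are assumptions, not the paper's mechanism.

```python
# Toy reverse auction clearing for a single loan listing.
def clear_reverse_auction(amount, bids):
    """bids: list of (rate, size); returns (clearing_rate, allocations)."""
    filled, allocations = 0.0, []
    for rate, size in sorted(bids):               # lowest rate wins first
        take = min(size, amount - filled)
        if take <= 0:
            break
        allocations.append((rate, take))
        filled += take
    clearing_rate = allocations[-1][0] if filled >= amount else None
    return clearing_rate, allocations

# clear_reverse_auction(10000, [(0.08, 6000), (0.10, 5000), (0.12, 4000)])
# -> clearing rate 0.10: the loan is fully funded by the two cheapest bids
```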
When evaluating efficiency with data envelopment analysis (DEA), every index must have a preference direction, that is, either the bigger the better or the smaller the better. When a neutral indicator appears in the evaluation index system, however, traditional DEA methods are unable to handle it. Therefore, against the background of economic efficiency and industrial structure adjustment, this paper proposes a data envelopment analysis model with neutral indexes from a systems science viewpoint. The model not only gives the efficiency of an economic system, but also shows how the system can improve its efficiency by adjusting its industrial structure. Finally, the method is applied to problems of economic structure adjustment in Tianjin.
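As background, the baseline input-oriented CCR envelopment model that such extensions start from can be solved with one linear program per DMU; the sketch below is this standard model only, not the paper's neutral-index variant.

```python
# Input-oriented CCR DEA: min theta s.t. X@lam <= theta*x0, Y@lam >= y0, lam >= 0.
import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(X, Y, j0):
    """X: inputs (m x n DMUs), Y: outputs (s x n); returns theta of DMU j0.
    theta = 1 means DMU j0 lies on the efficient frontier."""
    m, n = X.shape
    s = Y.shape[0]
    c = np.r_[1.0, np.zeros(n)]                    # variables: [theta, lambda]
    A_ub = np.vstack([np.c_[-X[:, j0], X],         # X@lam - theta*x0 <= 0
                      np.c_[np.zeros(s), -Y]])     # -Y@lam <= -y0
    b_ub = np.r_[np.zeros(m), -Y[:, j0]]
    bounds = [(None, None)] + [(0, None)] * n
    return linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds).fun
```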
From the perspective of efficiency change, scale change and technical change, this paper explores firm-level total factor productivity (TFP) change in e-commerce and its determinants using DEA Malmquist models. It shows that TFP growth can stem from technical progress, efficiency improvement, scale expansion, or combinations of the three, but technical change plays the dominant role in driving TFP change. Specifically, B2C (business-to-customer) e-commerce firms have higher technical innovation ability and TFP levels than B2B (business-to-business) and OTA (online travel agent) firms. Although OTA firms achieve technical efficiency improvement and scale expansion, technical regression leads to their TFP decline, while the TFP decline of B2B firms is due mainly to the combination of efficiency, technical and scale changes. These findings suggest that managers should correctly handle the relationships among technology, scale and efficiency and improve TFP in ways suited to their particular e-business models.
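The decomposition driving these statements is the standard Färe et al. Malmquist index split, shown below in textbook notation with distance functions D (under variable returns to scale, the efficiency change term splits further into pure efficiency change and scale change). This is the generic formula, not reproduced from the paper.

```latex
% Malmquist TFP index = efficiency change x technical change:
M = \underbrace{\frac{D^{t+1}(x^{t+1},y^{t+1})}{D^{t}(x^{t},y^{t})}}_{\text{efficiency change}}
    \times
    \underbrace{\left[\frac{D^{t}(x^{t+1},y^{t+1})}{D^{t+1}(x^{t+1},y^{t+1})}
    \cdot \frac{D^{t}(x^{t},y^{t})}{D^{t+1}(x^{t},y^{t})}\right]^{1/2}}_{\text{technical change}}
```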
Passenger flow assignment for urban rail transit is the basic theory for estimating flow distribution over the network. This paper first reviews the passenger flow assignment literature, comparatively analyzing existing models in terms of supply network modeling frameworks, passenger travel behavior hypotheses and assignment principles. Based on a classification of assignment models, it then focuses on network route search algorithms, model solution algorithms, and algorithms for simulating passenger behavior and flow distribution. Finally, prospects for passenger flow assignment research under the new conditions of networked urban rail transit operation in China are proposed.
To express the relative development of attribute values in the aggregation process, a new type of operator, the ordered fractile weighted aggregation (OFWA) operator, is proposed. This operator is especially suitable for incentive problems. Its main characteristic is that a fractile variable is introduced to measure the degree of development of attribute values, so that the incentive preferences of decision makers are incorporated flexibly into the aggregation. A properties analysis shows that the operator is commutative, bounded, and monotonic under certain conditions. In addition, a numerical example applies it to an incentive evaluation problem, showing that the operator can enlarge or reduce the aggregated values by letting the sum of the weights differ from 1. In this case the differences among objects become more distinct, realizing the purpose of incentive.
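As a reference point, the classical OWA aggregation that the OFWA operator generalizes is shown below, including the effect of letting the weights sum to more than 1; the weight vectors are illustrative, and the fractile machinery of the OFWA itself is not reproduced here.

```python
# Ordered weighted aggregation: weights attach to positions in the sorted
# (descending) value vector, not to particular attributes.
import numpy as np

def owa(values, weights):
    """len(weights) == len(values); the weights need not sum to 1."""
    return float(np.sort(values)[::-1] @ np.asarray(weights, dtype=float))

owa([0.7, 0.9, 0.4], [0.5, 0.3, 0.2])   # = 0.74, weights sum to 1
owa([0.7, 0.9, 0.4], [0.6, 0.4, 0.3])   # = 0.94, sum > 1 enlarges the score
```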
For hybrid multi-criteria decision making problems under uncertainty, an intuitionistic fuzzy decision-making approach based on prospect theory and evidential reasoning is proposed. First, three kinds of hybrid information, namely precise numbers, interval numbers and linguistic variables, are unified as intuitionistic fuzzy numbers to retain the uncertainty of the decision. Then, a prospect decision matrix reflecting the bounded rationality assumed by prospect theory is expressed with intuitionistic fuzzy numbers. Finally, the prospect decision information is aggregated using the evidential reasoning approach to reduce the loss of decision information in the calculation, to rank all the alternatives and to select the best one. An illustrative example shows that the proposed method is rational and feasible.
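The bounded-rationality ingredient is Kahneman and Tversky's prospect value function; a minimal sketch with the standard 1992 parameter estimates (which the paper may or may not use) follows.

```python
# Prospect-theory value function: concave for gains, convex and steeper
# (loss aversion lambda) for losses, relative to a reference point.
def prospect_value(x, alpha=0.88, beta=0.88, lam=2.25):
    return x ** alpha if x >= 0 else -lam * (-x) ** beta

prospect_value(100)    # ~57.5
prospect_value(-100)   # ~-129.4: losses loom larger than gains
```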
For multi-attribute decision making problems in the interval type-2 trapezoidal fuzzy set (IT2TrFS) environment, this paper proposes a likelihood-based ranking method that accounts for the risk preferences of decision makers. First, decision makers are classified by risk preference and a method for measuring their risk preferences is proposed. Then, the risk preference decision matrix is obtained and a new likelihood formula is defined. Finally, the alternatives are ranked by calculating the signed distance. An example shows that the proposed method is scientific and reasonable and that different risk preferences of decision makers influence the decision results. Comparison with previous methods shows that the proposed algorithm is more broadly applicable: it accommodates both risk-seeking and risk-averse decision makers.
Based on the structure and application functions of information networks, the mechanisms by which high power microwaves affect the electronic devices of information networks are analyzed, and a multi-level attribute index system for evaluating the effects on information networks is established. An interval-based cloud model, which combines interval numbers with the cloud model, is proposed to join the qualitative and quantitative attributes of the indexes; the interval weights of the evaluation indexes are obtained by the interval analytic hierarchy process, and the overall effectiveness of the information network is fitted using evaluation-level interval-based cloud models. The method is shown to be feasible and effective through an example, whose results indicate that the use of interval numbers captures both the fuzziness and the uncertainty arising in the course of evaluation.
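The quantitative core of a cloud model is the forward normal cloud generator over the digital characteristics (Ex, En, He); a minimal sketch follows, with illustrative inputs (the paper's interval extension replaces these scalars with interval numbers).

```python
# Forward normal cloud generator: each drop gets its own entropy sample,
# producing the characteristic fuzzy-plus-random scatter of a cloud model.
import numpy as np

def cloud_drops(Ex, En, He, n=1000, seed=0):
    rng = np.random.default_rng(seed)
    En_i = np.abs(rng.normal(En, He, n))            # per-drop entropy
    x = rng.normal(Ex, En_i)                        # drop positions
    mu = np.exp(-(x - Ex) ** 2 / (2 * En_i ** 2))   # certainty degrees
    return x, mu

# x, mu = cloud_drops(Ex=0.7, En=0.05, He=0.01)
```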
An isomorphic three-mode Bayesian network (BN) method is proposed based on the topology of the reliability block diagram (RBD). It overcomes the drawbacks of transforming an RBD into an equivalent three-layer BN, namely the large topological difference and the combinatorial explosion problem. Traditional RBDs and three-layer models have two modes, normal and failure; in this paper the failure mode is further divided into two new modes, physical failure and normal-but-not-working. Accordingly, a three-mode BN model with the modes normal, physical failure and normal-but-not-working is formulated to replace the original two-mode BN model, and the conditional probability table (CPT) of each node in the three-mode BN is analyzed in detail. Finally, the equivalent three-layer BN and the isomorphic three-mode BN are both built for the RBD of an aircraft navigation mission, and the two BNs are computed and compared. The results demonstrate that the new three-mode BN model overcomes the combinatorial explosion problem and is an effective method for system reliability analysis.
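The mode-combination logic can be illustrated for a two-component series block; the state probabilities below are made-up numbers, and the deterministic combination rule stands in for one row pattern of the CPTs the paper analyzes.

```python
# Toy three-mode combination for a two-component series block.
import numpy as np

MODES = ["normal", "physical_failure", "not_working"]
p_a = np.array([0.90, 0.06, 0.04])      # component A mode probabilities
p_b = np.array([0.85, 0.10, 0.05])      # component B mode probabilities

# Series logic: the block is normal only if both are normal; it physically
# fails if either physically fails; otherwise it is normal-but-not-working.
p_block = np.zeros(3)
for i, pa in enumerate(p_a):
    for j, pb in enumerate(p_b):
        if i == 0 and j == 0:
            k = 0
        elif i == 1 or j == 1:
            k = 1
        else:
            k = 2
        p_block[k] += pa * pb
print(dict(zip(MODES, p_block)))
```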
This paper presents several integer programming models that minimize the resource peak for the cases where activities cannot be split, can be split freely, or can be split only in specified ways. The validity of the models is verified through examples and a case derived from a real project. In addition, a sequential method for solving the multi-resource leveling problem is proposed. Compared with traditional models, the proposed models formulate resource peaks independently and require neither determining the critical path nor computing the floats of non-critical activities; they therefore offer better flexibility with respect to resource constraints and specific requirements on activity start or finish times.
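For the no-splitting case, a hedged sketch of the peak-minimization model follows, using binary start-time variables and a peak variable P; the durations, resource rates and horizon are illustrative, and precedence constraints are omitted for brevity.

```python
# Minimize the resource peak with binary start-time variables (PuLP/CBC).
import pulp

T = 8                                            # planning horizon (periods)
acts = {"A": (2, 3), "B": (3, 2), "C": (2, 4)}   # activity: (duration, rate)

prob = pulp.LpProblem("resource_peak", pulp.LpMinimize)
P = pulp.LpVariable("peak", lowBound=0)
x = {(a, t): pulp.LpVariable(f"x_{a}_{t}", cat="Binary")
     for a, (d, _) in acts.items() for t in range(T - d + 1)}

prob += P                                        # objective: minimize the peak
for a, (d, _) in acts.items():                   # each activity starts once
    prob += pulp.lpSum(x[a, t] for t in range(T - d + 1)) == 1
for tau in range(T):                             # usage per period <= peak
    prob += pulp.lpSum(r * x[a, t]
                       for a, (d, r) in acts.items()
                       for t in range(T - d + 1)
                       if t <= tau < t + d) <= P
prob.solve(pulp.PULP_CBC_CMD(msg=False))
```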
To design an efficient attribute reduction algorithm, the rough equivalence class (REC) is first proposed based on the smallest computational unit of global equivalence classes, and REC-based reduction is proved equivalent to reduction on the original information system. The properties of 1-, 0-, and -1-RECs are then studied: positive region computation is converted into incremental computation based on bilateral shrinking of 0-RECs, and, combined with the transitivity of 1- and -1-RECs, optimality principles are designed to delete objects bilaterally and horizontally and to delete attributes vertically, which shrinks the computational domain in each round; on this basis, an incremental computation method with multiple hashing is designed. Finally, incremental core and attribute reduction algorithms are proposed. Core computation follows the vertical optimality principle, so more than one non-core attribute can be identified in a single round and not all attributes need to be traversed. UCI, massive, and ultra-high-dimensional data sets are used to verify the algorithms, and the results show that they are complete and efficient, with particular advantages on massive and ultra-high-dimensional data sets.
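The hashing idea at the base of the method can be shown in a few lines: objects are grouped into condition classes by a hash of their attribute values, and a class joins the positive region when its decision values are consistent. The data layout is an assumption for illustration.

```python
# Positive-region computation by hashing condition-attribute value tuples.
from collections import defaultdict

def positive_region(table, attrs):
    """table: rows of values, last column = decision; attrs: column indices."""
    groups = defaultdict(set)
    for i, row in enumerate(table):
        groups[tuple(row[a] for a in attrs)].add(i)   # condition classes
    pos = set()
    for members in groups.values():
        decisions = {table[i][-1] for i in members}
        if len(decisions) == 1:                       # consistent class
            pos |= members
    return pos
```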
Based on historical event records, the probabilistic risk of a natural disaster is traditionally calculated by coupling the probability distribution function of the intensity of the risk source with the vulnerability function of the risk-bearing body. The probabilistic risk obtained in this way is the expected loss of a single disaster event and contains no time factor, although time is one of the three basic factors describing natural disaster risk. Consequently, the traditional probabilistic risk carries a systematic error. This paper therefore suggests a formal model that corrects the systematic error by adding a time factor. Taking the comparison of typhoon risks between Zhejiang and Guangdong provinces as a case study, an application model for probabilistic typhoon risk is derived from the formal model, with information diffusion techniques used to address the small sample problem. The results show that the corrected and uncorrected risks lead to different conclusions in the comparison, and that the corrected risk based on the suggested model is more reasonable.
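One natural reading of the correction is that the traditional risk is a per-event expected loss, while the corrected risk scales it by the annual occurrence rate into an expected annual loss; the notation below is assumed, not the paper's.

```latex
% f(m): probability density of hazard intensity m; L(m): vulnerability (loss)
% function; \lambda: annual occurrence rate of disaster events.
R_{\mathrm{event}} = \int_{0}^{\infty} L(m)\,f(m)\,\mathrm{d}m,
\qquad
R_{\mathrm{annual}} = \lambda \cdot R_{\mathrm{event}}
```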
Considering that the current single-peak sampling method is not reasonable enough, this paper presents an improved flood frequency analysis approach based on annual, semi-annual and quarterly extreme value series. Quarterly, semi-annual and annual extreme flows are extracted from daily flow records, with the independence of adjacent peaks ensured beforehand by assuming 7 and 15 days as the minimum interval between two floods, and design floods are computed from the expanded data series after testing for the best-fitting probability distribution. The results show that, for the given watershed, the design values are smaller than those of the traditional method for short recurrence intervals of 5, 10, 20 and 50 years, while for return periods of 100 years or more the pattern reverses. For the 500-year design value, the percentage error between quarterly and semi-annual sampling reaches 30%, against 2.2% between semi-annual and annual maximum sampling. The approach provides more precise design values for different design flood standards and improves on the traditional method in terms of calculation accuracy and safety.
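The extreme-value step can be sketched as below: fit a candidate distribution to an extreme-flow series and read the T-year design value off the fitted quantile function. The GEV is fixed here purely for illustration; the paper tests several distributions and sampling frequencies.

```python
# T-year design flood from an annual-maximum series via a fitted GEV.
import numpy as np
from scipy.stats import genextreme

def design_flood(annual_maxima, T):
    """Return the discharge with exceedance probability 1/T per year."""
    shape, loc, scale = genextreme.fit(np.asarray(annual_maxima, dtype=float))
    return genextreme.ppf(1 - 1 / T, shape, loc=loc, scale=scale)

# e.g. design_flood(ams, 100) -> 100-year design discharge
```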