
Forecasting the Advertising Investment Risk of Sporting Goods Based on an Optimized Support Vector Machine


Abstract. Forecasting the advertising investment risk of sporting goods is very important, as it can provide decision support for top managers. In this paper, we present an optimized support vector machine (OSVM) to predict the advertising investment risk of sporting goods. Experimental results show that the proposed method improves prediction accuracy.

Key words: Advertising Investment Risk, Forecasting, Support Vector Machine, Particle Swarm Optimization, Generalized Pattern Search.

1. Introduction

Forecasting the advertising investment risk of sporting goods plays a very important role in company operation, as it can provide decision support for top managers. Many forecasting methods have been proposed in past decades. Paper [1] evaluated advertising investment risk early-warning with a fuzzy back-propagation evaluation model, which can handle nonlinear and uncertain problems. Paper [2] presented a reliability risk assessment model based on fuzzy FMEA and analyzed the potential fault modes, so that prevention can start before production and market problems appear. Paper [3] constructed an advertising investment risk evaluation index system with the grey analytic hierarchy process method. Paper [4] evaluated enterprise advertising investment risk with fuzzy analysis and the analytic hierarchy process. Paper [5] argued that it is very important to accurately predict the outcome of a process or the future state of a system, and predicted outcomes in emerging advertising investment environments with a support vector machine. However, to use an SVM one must determine several parameters, which hinders accurate prediction.

To obtain good results with an SVM, researchers have proposed many algorithms to optimize these parameters, including genetic algorithms (GA), the artificial bee colony algorithm (ABCA), the chaotic ant swarm algorithm (CASA), and quantum particle swarm optimization (QPSO). Particle swarm optimization (PSO) is a recent method in the field of swarm intelligence with a strong global search capability, but it easily falls into local optima and converges slowly in the late stage. Generalized pattern search (GPS) has excellent local search ability, but its result depends heavily on the initial location.

In this paper, we propose an optimized support vector machine (OSVM) to predict the advertising investment risk of sporting goods. Firstly, combining the advantages and disadvantages of PSO and GPS, we present an optimized support vector machine. Secondly, we predict the advertising investment risk of sporting goods with the OSVM. Lastly, the results show the effectiveness of our method.

The paper is organized as follows. Section 2 describes the SVM algorithm. In Section 3, we introduce particle swarm optimization (PSO) and generalized pattern search (GPS), and then present the optimized support vector machine (OSVM). Section 4 presents our experimental results of parameter search using PSO and GPS.

2. Support Vector Machines (SVM)

The support vector machine is a learning method built on the VC-dimension theory and the structural risk minimization principle of statistical learning theory. Given limited sample information, it seeks the best trade-off between model complexity (the accuracy attainable on the specific training samples) and learning capacity (the ability to classify unseen samples without error), in order to obtain the best generalization ability. An SVM maps the input vectors into a higher-dimensional space and constructs a maximum-margin hyperplane there: two parallel hyperplanes are built on either side of the separating hyperplane, and the separating hyperplane is chosen so as to maximize the distance between them. The key to the SVM is the kernel function. A vector set is usually difficult to classify in a low-dimensional space; the remedy is to map it into a high-dimensional space, but doing so directly increases the computational complexity, and the kernel function solves this problem cleverly: as long as an appropriate kernel function is chosen, the classification function of the high-dimensional space can be obtained. In the theory of SVMs, different kernel functions lead to different SVM algorithms.

Let there be n labeled training samples (x_i, y_i), where each x_i is a vector of dimensionality d and each label y_i gives the class of the sample x_i.

In the binary classification problem, the decision function of SVM is

f(x) = \operatorname{sgn}\left( w^{T}\phi(x) + b \right),

where \phi(x) is a mapping of the sample x from the input space to a high-dimensional feature space.

The optimal values of w and b are determined by solving the following optimization problem:

\min_{w,b,\xi}\ \frac{1}{2}\|w\|^{2} + c\sum_{i=1}^{n}\xi_{i}
\quad \text{s.t.}\quad y_{i}\left( w^{T}\phi(x_{i}) + b \right) \ge 1 - \xi_{i},\ \ \xi_{i} \ge 0,\ \ i = 1,\dots,n,

where \|\cdot\| denotes the l2 norm and c is a regularization constant.

The above formulation can be transformed into the equivalent dual optimization problem:

\max_{\alpha}\ \sum_{i=1}^{n}\alpha_{i} - \frac{1}{2}\sum_{i=1}^{n}\sum_{j=1}^{n}\alpha_{i}\alpha_{j}y_{i}y_{j}K(x_{i},x_{j})
\quad \text{s.t.}\quad \sum_{i=1}^{n}\alpha_{i}y_{i} = 0,\ \ 0 \le \alpha_{i} \le c,

where K(x_{i},x_{j}) = \phi(x_{i})^{T}\phi(x_{j}).

In this paper, we select a kernel function satisfying the Mercer condition, the Gaussian (RBF) kernel

K(x_{i},x_{j}) = \exp\left( -\frac{\|x_{i}-x_{j}\|^{2}}{2\sigma^{2}} \right),

whose width \sigma, together with the regularization constant c, must be determined before training.
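As a minimal illustration of an RBF-kernel SVM classifier (a sketch in Python with scikit-learn, not the paper's MATLAB implementation; the data and parameter values here are synthetic assumptions):

```python
# Minimal sketch: a binary SVM with the Gaussian (RBF) kernel.
# gamma corresponds to 1/(2*sigma^2) in the kernel formula above.
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)
# Two synthetic classes of 10-dimensional "risk factor" vectors.
X = np.vstack([rng.normal(-1.0, 1.0, (60, 10)),
               rng.normal(+1.0, 1.0, (60, 10))])
y = np.array([-1] * 60 + [+1] * 60)  # -1: risk present, +1: no risk

clf = SVC(C=1.0, kernel="rbf", gamma=0.1)  # C and gamma are the tuned parameters
clf.fit(X, y)
print(clf.predict(X[:2]))
```

The choice of C and gamma (i.e., c and sigma) is exactly what the optimization procedure of Section 3 is meant to automate.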

3. Optimized Support Vector Machines (OSVM)

In this paper, in order to optimize the parameters of the SVM, we combine the advantages and disadvantages of PSO and GPS and present an optimized support vector machine to predict the advertising investment risk of sporting goods.

3.1. Particle swarm optimization (PSO)

PSO is initialized with a group of random particles (random solutions) and then searches for the optimal solution through iterations. In every iteration, each particle updates itself by tracking two "extremes": the first is the best solution found by the particle itself, called the personal best pBest; the other is the best solution found by the whole population, the global extremum gBest. Alternatively, instead of the whole population, only a neighborhood of the particle may be used, in which case the best value among all neighbors is a local extremum. PSO has advantages such as global search capability; however, it easily falls into local optima and converges slowly in the late stage.

At each iteration t, the position x_{ij}(t) of the ith particle is updated by a velocity v_{ij}(t+1). The position is updated for the next iteration using

x_{ij}(t+1) = x_{ij}(t) + v_{ij}(t+1),

where x_{ij}(t) denotes the position of particle i in dimension j of the search space at time step t. The position of the particle is changed by adding a velocity to the current position. The velocity update rule is calculated as

v_{ij}(t+1) = w\,v_{ij}(t) + c_{1}r_{1}\left[ p_{ij}(t) - x_{ij}(t) \right] + c_{2}r_{2}\left[ g_{j}(t) - x_{ij}(t) \right],

where w represents the inertia weight, v_{ij}(t) is the velocity of particle i in dimension j at time step t, p_{ij}(t) represents the best position found by the particle in its flying course, and g_{j}(t) represents the best position of the whole flock. c_{1} and c_{2} are learning factors: c_{1} expresses the confidence of a particle in itself, and c_{2} expresses the confidence of a particle in its neighbors.

The velocity update rule depends on the current velocity v_{ij}(t) and the weighted difference vectors p_{ij}(t) - x_{ij}(t) and g_{j}(t) - x_{ij}(t). (1) The global best position at time step t is the best position discovered by any particle since the first time step. (2) r_{1} and r_{2} are random values in the range [0, 1] that add a stochastic element to the algorithm. (3) c_{1} and c_{2} are positive acceleration coefficients used to scale the contributions of the cognitive and social components, respectively. (4) Particles gain strength by cooperation and are most effective when c_{1} and c_{2} are well balanced. (5) For w < 1, particles decelerate until their velocity is zero; for w \ge 1, velocities increase over time until the maximum velocity is reached and the swarm diverges.

The maximum velocity is calculated as a fraction of the distance between the bounds of the search space as follows:

v_{\max,j} = \delta\left( x_{\max,j} - x_{\min,j} \right), \quad \delta \in (0, 1].

To improve PSO convergence, the velocity update rule with a constriction coefficient is given as

v_{ij}(t+1) = \chi\left[ v_{ij}(t) + \phi_{1}\left( p_{ij}(t) - x_{ij}(t) \right) + \phi_{2}\left( g_{j}(t) - x_{ij}(t) \right) \right],

where

\chi = \frac{2k}{\left| 2 - \phi - \sqrt{\phi(\phi - 4)} \right|},

with

\phi = \phi_{1} + \phi_{2}, \quad \phi_{1} = c_{1}r_{1}, \quad \phi_{2} = c_{2}r_{2}.

The conditions \phi \ge 4 and k \in [0, 1] ensure that the swarm converges. The parameter k controls the swarm's exploration and exploitation abilities.

Usually, c_{1} and c_{2} are static and their optimized values are found empirically. In fact, the ratio between the c_{1} and c_{2} coefficients is problem dependent. To deal with this problem, c_{1} and c_{2} are updated at time step t as

c_{1}(t) = \left( c_{1,\min} - c_{1,\max} \right)\frac{t}{n_{t}} + c_{1,\max}, \quad
c_{2}(t) = \left( c_{2,\max} - c_{2,\min} \right)\frac{t}{n_{t}} + c_{2,\min},

where n_{t} is the maximum number of time steps.
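The update rules above can be illustrated with a short, self-contained PSO sketch (our own minimal stand-in in Python, not the paper's MATLAB code; the bounds, swarm size, and sphere-function fitness are illustrative assumptions):

```python
# Illustrative PSO with inertia weight and time-varying c1/c2,
# minimizing the sphere function as a stand-in fitness.
import numpy as np

def pso(fitness, dim, n_particles=20, n_steps=100,
        w=0.7, c1_max=2.5, c1_min=0.5, c2_max=2.5, c2_min=0.5,
        lo=-5.0, hi=5.0, seed=0):
    rng = np.random.default_rng(seed)
    x = rng.uniform(lo, hi, (n_particles, dim))   # positions
    v = np.zeros((n_particles, dim))              # velocities
    v_max = 0.5 * (hi - lo)                       # velocity clamp
    pbest = x.copy()
    pbest_f = np.array([fitness(p) for p in x])
    g = pbest[pbest_f.argmin()].copy()            # global best
    for t in range(n_steps):
        # time-varying acceleration coefficients
        c1 = (c1_min - c1_max) * t / n_steps + c1_max  # decreases
        c2 = (c2_max - c2_min) * t / n_steps + c2_min  # increases
        r1, r2 = rng.random((2, n_particles, dim))
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
        v = np.clip(v, -v_max, v_max)
        x = np.clip(x + v, lo, hi)
        f = np.array([fitness(p) for p in x])
        better = f < pbest_f
        pbest[better], pbest_f[better] = x[better], f[better]
        g = pbest[pbest_f.argmin()].copy()
    return g, pbest_f.min()

best_x, best_f = pso(lambda p: float(np.sum(p ** 2)), dim=2)
print(best_x, best_f)
```

Decreasing c1 and increasing c2 over time shifts the swarm from exploration (trusting each particle's own history) to exploitation (trusting the global best).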

PSO has advantages such as global search capability; however, it suffers from the shortcoming of being trapped in local optima. To deal with this problem, the GPS algorithm is introduced.

3.2. Generalized pattern search (GPS)

The GPS method is an efficient search method, first introduced and analyzed for unconstrained minimization problems and later extended by Lewis and Torczon to problems with bound and general linear constraints; a summary of work on the GPS method can be found in the literature. GPS is a direct search method for solving optimization problems: it uses only function values, not derivative information of the objective or constraint functions, which makes it effective for problems whose derivatives are unavailable or expensive to compute. The iterates generated by a pattern search algorithm move on an integer lattice of a certain mesh size; the acceptance condition on each step can be relaxed relative to classical algorithms, and convergence is achieved by strengthening the conditions on the form of the steps.

Starting from an initial guess and an initial value of the step-length control parameter, the GPS method searches to generate a sequence of iterates x_{k} such that f(x_{k+1}) < f(x_{k}), where f is the objective function and x_{k+1} is an improved mesh point. When the search step fails to provide an improved mesh point, the poll step is invoked. The poll step consists in evaluating the objective function f at the neighboring mesh points and then judging whether a lower objective function value can be found. If the poll step cannot find an improved mesh point, the current solution is declared a local mesh optimizer. The mesh size parameter \Delta_{k} is refined as follows:

\Delta_{k+1} = \tau^{m_{k}}\Delta_{k},

where m_{k} is an integer, and \tau > 1 is a real number that remains constant over the whole search process.

The mesh is centered at the current iterate x_{k} and is defined by a finite set of directions D that positively spans R^{n}. Denoting by n_{D} the cardinality of D, the mesh is given by

M_{k} = \left\{ x_{k} + \Delta_{k} D z : z \in Z_{+}^{n_{D}} \right\},

where Z_{+} is the set of nonnegative integers.
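The poll step and the mesh refinement rule can be sketched with a simple compass-style pattern search (our own minimal Python stand-in, assuming the 2n coordinate directions as D and a shifted sphere objective; not Lewis and Torczon's implementation):

```python
# Compass-style pattern search: poll the 2n coordinate directions;
# on success expand the mesh by tau, on failure shrink it by 1/tau.
import numpy as np

def pattern_search(f, x0, delta=1.0, tau=2.0, tol=1e-6, max_iter=1000):
    x = np.asarray(x0, dtype=float)
    fx = f(x)
    n = x.size
    # D: positive spanning set of 2n coordinate directions
    D = np.vstack([np.eye(n), -np.eye(n)])
    for _ in range(max_iter):
        if delta < tol:          # mesh tolerance reached
            break
        improved = False
        for d in D:              # poll step
            trial = x + delta * d
            ft = f(trial)
            if ft < fx:          # improved mesh point found
                x, fx, improved = trial, ft, True
                break
        delta = delta * tau if improved else delta / tau
    return x, fx

x_best, f_best = pattern_search(lambda p: float(np.sum((p - 1.0) ** 2)),
                                x0=[4.0, -3.0])
print(x_best, f_best)
```

Because only function comparisons are used, this works even when the objective (e.g., cross-validated SVM error) has no usable derivatives.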

3.3. Optimized Support Vector Machines (OSVM)

The process of selecting the parameters (c, \sigma) of the SVM by PSO combined with GPS is described as follows:

Step 1 Randomly initialize the particles with (c, \sigma) in PSO.

Step 2 Memorize, for each individual, the position vector and velocity vector with which the individual has acquired its best fitness.

Step 3 Note the fitness value and update the position and velocity of the particles with the position and velocity update rules.

Step 4 Define the starting point and other parameters of GPS according to the results of Step 3.

Step 5 Evaluate the fitness at each mesh point M_j, j = 1, ..., n_D.

Step 6 Compare the fitness at the current point and at the mesh points M_j. If one of the M_j is fitter than the current point, expand the pattern vector, build a new mesh centered on this point M_j, and go to Step 5.

Step 7 If the current point is fitter than every M_j, judge whether the stopping criterion is met. If the stopping criterion is satisfied, update the current point and output it together with its fitness value; otherwise, follow the next step.

Step 8 Add the pattern vector to the current vector to update the current point, build a new, refined mesh on this current point, and then go to Step 5.

Step 9 Stop the GPS algorithm when any of the following conditions occurs: (1) the mesh size is less than the mesh tolerance; (2) the number of iterations exceeds a predefined value; (3) the change in the objective function is less than the function tolerance.
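The steps above can be sketched end to end in Python (a hedged stand-in for the paper's MATLAB implementation: the synthetic data, search ranges, swarm settings, and the choice of 3-fold cross-validated error as the fitness are all our own illustrative assumptions):

```python
# Sketch of the OSVM parameter search: a short PSO run over
# (log10 C, log10 sigma) seeds a pattern search that refines the
# best particle. Fitness = 3-fold cross-validated error of an RBF SVM.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

rng = np.random.default_rng(1)
X = np.vstack([rng.normal(-1, 1, (45, 10)), rng.normal(1, 1, (45, 10))])
y = np.array([-1] * 45 + [1] * 45)

def fitness(p):  # p = [log10(C), log10(sigma)] -> CV error to minimize
    C, sigma = 10.0 ** p[0], 10.0 ** p[1]
    clf = SVC(C=C, kernel="rbf", gamma=1.0 / (2.0 * sigma ** 2))
    return 1.0 - cross_val_score(clf, X, y, cv=3).mean()

# --- Steps 1-3: coarse PSO over the (log C, log sigma) plane ---
n, steps = 10, 15
pos = rng.uniform(-2, 2, (n, 2)); vel = np.zeros((n, 2))
pb = pos.copy(); pbf = np.array([fitness(p) for p in pos])
g = pb[pbf.argmin()].copy()
for t in range(steps):
    r1, r2 = rng.random((2, n, 2))
    vel = 0.7 * vel + 1.5 * r1 * (pb - pos) + 1.5 * r2 * (g - pos)
    pos = np.clip(pos + vel, -2, 2)
    f = np.array([fitness(p) for p in pos])
    mask = f < pbf
    pb[mask], pbf[mask] = pos[mask], f[mask]
    g = pb[pbf.argmin()].copy()

# --- Steps 4-9: pattern search refinement from the PSO result ---
x, fx, delta = g.copy(), fitness(g), 0.5
while delta > 1e-3:
    polled = [x + delta * d for d in np.vstack([np.eye(2), -np.eye(2)])]
    fs = [fitness(p) for p in polled]
    if min(fs) < fx:            # improved mesh point: expand
        x, fx = polled[int(np.argmin(fs))], min(fs)
        delta *= 2.0
    else:                       # poll failed: refine the mesh
        delta /= 2.0
print("log10(C), log10(sigma):", x, "CV error:", fx)
```

The PSO stage supplies a good starting point, so the pattern search only has to perform the local refinement it is best at.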

4. Forecasting the Advertising Investment Risk of Sporting Goods

In this study, the data of a certain sporting goods company between 2001 and 2011 are used to study the brand advertising investment risk prediction ability of the optimized SVM. The related factors are the following: product customer risk, supplier risk, brand market competition risk, similar-competitor risk, sales risk, advertising investment risk management, advertisement investment risk, brand building and maintenance risk, brand extension risk, and brand management risk. These related factors are normalized. The risk state for each month is categorized as "+1" or "-1", where "+1" means that the sporting goods company does not have brand advertising investment risk, and "-1" means that it does.

To verify the effectiveness of the proposed method, we selected two-thirds of the data for training and the rest for testing. We predict the advertising investment risk of sporting goods with the OSVM. Meanwhile, we conduct experiments with PSO-SVM (SVM optimized with PSO) and GPS-SVM (SVM optimized with GPS). The algorithms are implemented in MATLAB on Windows XP SP3 with a 2.6 GHz CPU, using c_{1,max} = 2.5, c_{1,min} = 0.5, c_{2,max} = 2.5, c_{2,min} = 0.5. The experimental results are summarized in Table 1.
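The data preparation described above can be sketched as follows (the records are synthetic placeholders, since the company's 2001-2011 data are not public; min-max normalization is assumed, as the paper does not name the scheme):

```python
# Sketch: min-max normalization of risk-factor columns and a
# two-thirds / one-third train-test split.
import numpy as np

rng = np.random.default_rng(2)
data = rng.uniform(0, 100, (132, 10))  # 132 months x 10 risk factors (synthetic)
labels = rng.choice([-1, 1], 132)      # -1: risk present, +1: no risk (synthetic)

# min-max normalization to [0, 1] per factor
lo, hi = data.min(axis=0), data.max(axis=0)
norm = (data - lo) / (hi - lo)

split = (2 * len(norm)) // 3           # two-thirds for training
X_train, X_test = norm[:split], norm[split:]
y_train, y_test = labels[:split], labels[split:]
print(X_train.shape, X_test.shape)
```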

Table 1. Experiment results of Advertising investment risk prediction

Table 1 compares the training time and the advertising investment risk prediction accuracy for sporting goods among the PSO-SVM, the GPS-SVM and the OSVM. The OSVM has the best forecasting effect of the three, and the GPS-SVM has a higher prediction accuracy than the PSO-SVM. The proposed method takes a shorter time yet offers higher precision, with a satisfactory accuracy of over 96%. The accuracy of the OSVM is 2.24% and 1.65% higher, respectively, than that of the PSO-SVM and the GPS-SVM. This is because the OSVM combines the advantages and disadvantages of PSO and GPS: it optimizes the SVM more effectively than PSO-SVM and GPS-SVM, and can easily find the proper parameters of the SVM. The OSVM has a good application prospect in the advertising investment risk prediction of sporting goods.

5. Conclusion

In this paper, we first proposed an optimized SVM and then predicted the advertising investment risk of sporting goods with it. The main contribution of the paper is to optimize the parameters of the SVM by combining the advantages and disadvantages of PSO and GPS, whereby the global searching ability of PSO is enhanced while its local searching ability is preserved. The experimental results show that the proposed method for predicting the advertising investment risk of sporting goods is promising.

References

[1] Jun Li, Ming Ren, Yunqi Zhang. "Research on Advertising Investment Risk Early-Warning System Based on FA-BP Evaluation Model". In Proceedings of the International Conference on Management of Technology, pp. 148-152, 2007.

[2] Wenqing Zhao, Zhouxin Qian. "Comprehensive Evaluation of Advertising Investment Risk Based on GAHP". Journal of Anhui University of Technology, vol. 23, no. 1, pp. 101-105, 2006.

[3] Dapeng Cui, David Curry. "Prediction in Marketing Using the Support Vector Machine". Marketing Science, vol. 24, no. 4, pp. 595-615, 2005.

[4] Ming Yu, Yueqiao Ai. "SVM Parameters Optimization Based on Artificial Bee Colony Algorithm and Its Application in Handwriting Verification". In Proceedings of the 2011 International Conference on Electrical and Control Engineering, pp. 5026-5029, 2011.

[5] Yuyan Ren, Jie Bao, Ming Su, Hongrui Wang. "Application of SVM Based on Improved Quantum Particle Swarm Optimization in Bio-mimetic Robotic Horse". Advanced Materials Research, vol. 204-210, pp. 306-309, 2011.