IPO underpricing algorithm

IPO underpricing is the increase in stock value from the initial offering price to the first-day closing price. Many believe that underpriced IPOs leave money on the table for the issuing corporations, while others believe that underpricing is inevitable. Investors state that underpricing signals high interest to the market, which increases demand. On the other hand, overpriced stocks will drop in the long run as the price stabilizes, so underpricing may keep issuers safe from investor litigation.
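Underpricing is commonly measured as the first-day return relative to the offer price. The snippet below is a minimal sketch of that calculation; the prices used are hypothetical and do not refer to any particular IPO.

    def underpricing(offer_price, first_day_close):
        """First-day return of an IPO relative to its offer price."""
        return (first_day_close - offer_price) / offer_price

    # Hypothetical example: offered at $27.00, closing at $35.10 on the first day
    print(f"{underpricing(27.00, 35.10):.1%}")   # 30.0%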

IPO underpricing algorithms

Underwriters, investors, and corporations going public through an initial public offering (IPO), the issuers, are all interested in the company's market value. Tension inevitably results because underwriters want to keep the offer price low while the issuing companies want a high IPO price.

Underpricing may also be caused by investor over-reaction, which causes price spikes in the first days of trading. The IPO pricing process is similar to pricing new and unique products, where there is sparse data on market demand, product acceptance, or competitive response. It is therefore difficult to determine a clear price, a difficulty compounded by the differing goals of issuers and investors.

The problem with developing algorithms to determine underpricing is dealing with noisy, complex, and unordered data sets. In addition, human factors and various environmental conditions introduce irregularities in the data. To resolve these issues, researchers have applied various techniques from artificial intelligence that normalize the data.
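As a minimal sketch of the kind of normalization step such techniques rely on, the snippet below rescales each input variable to zero mean and unit variance (z-score standardization); the feature names and values are hypothetical.

    import numpy as np

    def standardize(X):
        """Rescale each column to zero mean and unit variance (z-score)."""
        mean = X.mean(axis=0)
        std = X.std(axis=0)
        std[std == 0] = 1.0              # avoid division by zero for constant columns
        return (X - mean) / std

    # Hypothetical raw features: offer price ($), offering size ($m), price revision (%)
    issues = np.array([[27.0, 109.0, 12.5],
                       [18.0,  86.0, -3.0],
                       [14.0,  52.0,  5.0]])
    print(standardize(issues))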

Artificial neural network

Artificial neural networks (ANNs) address these issues by scanning the data to develop internal representations of the relationships within it. By determining those relationships over time, ANNs are responsive and adaptive to structural changes in the data. There are two main learning models for ANNs: supervised learning and unsupervised learning.

In supervised learning models, the network is trained on examples whose correct outputs are known; when a mistake is encountered, i.e. the network's output does not match the expected output, the algorithm typically uses backpropagation to correct it. In unsupervised learning models, the inputs are classified without labeled targets, according to the structure of the problem being solved.
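A minimal sketch of the supervised case, assuming a single hidden layer, a squared-error loss, and purely illustrative random training data, is shown below; the error is pushed back through the network with backpropagation and the weights are updated by gradient descent.

    import numpy as np

    rng = np.random.default_rng(0)
    sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

    # Hypothetical training set: four normalized issue features -> first-day return
    X = rng.normal(size=(100, 4))
    y = rng.normal(size=(100, 1))

    W1, W2 = rng.normal(size=(4, 8)), rng.normal(size=(8, 1))   # one hidden layer
    lr = 0.01

    for epoch in range(500):
        h = sigmoid(X @ W1)              # forward pass: hidden activations
        pred = h @ W2                    # linear output layer
        err = pred - y                   # mismatch between prediction and target
        # backpropagation: push the error back through each layer
        grad_W2 = h.T @ err / len(X)
        grad_h = err @ W2.T * h * (1 - h)
        grad_W1 = X.T @ grad_h / len(X)
        W2 -= lr * grad_W2
        W1 -= lr * grad_W1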

For example, Chou et al.[1] discuss their algorithm for forecasting the IPO price of Baidu. Their model is a three-layer neural network consisting of an input layer, a hidden layer, and an output layer.

They reduce the prediction error by searching for the best combination of connection weights through the neural network with an evolutionary (genetic) algorithm.
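The cited paper combines a genetic algorithm with an ANN; the sketch below illustrates one common way of doing this, evolving a population of weight vectors for a fixed three-layer network and keeping the ones with the lowest prediction error. It is a generic illustration of the technique under hypothetical sizes and data, not a reconstruction of Chou et al.'s exact procedure.

    import numpy as np

    rng = np.random.default_rng(1)
    n_in, n_hidden = 4, 6                       # input, hidden and output layers
    n_weights = n_in * n_hidden + n_hidden      # length of a flattened weight vector

    def forward(w, X):
        """Run the fixed three-layer network with a flattened weight vector."""
        W1 = w[:n_in * n_hidden].reshape(n_in, n_hidden)
        W2 = w[n_in * n_hidden:].reshape(n_hidden, 1)
        return np.tanh(X @ W1) @ W2

    def fitness(w, X, y):
        """Lower mean squared error means a fitter individual."""
        return np.mean((forward(w, X) - y) ** 2)

    # Hypothetical training data: issue features -> first-day return
    X = rng.normal(size=(80, n_in))
    y = rng.normal(size=(80, 1))

    pop = rng.normal(size=(40, n_weights))      # population of candidate weight vectors
    for generation in range(200):
        scores = np.array([fitness(w, X, y) for w in pop])
        parents = pop[np.argsort(scores)[:10]]  # keep the ten fittest individuals
        children = (parents[rng.integers(0, 10, size=30)]
                    + rng.normal(scale=0.1, size=(30, n_weights)))
        pop = np.vstack([parents, children])    # next generation: elites plus mutants

    best = pop[np.argmin([fitness(w, X, y) for w in pop])]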

Evolutionary models

Evolutionary programming is often paired with other algorithms, e.g. ANNs, to improve robustness, reliability, and adaptability. Evolutionary models reduce error rates by allowing the numerical values to change within the fixed structure of the program. Designers supply their algorithms with the relevant variables and with training data that help the program generate rules, defined over the input space, that make a prediction in the output variable space.

In this approach, each candidate solution is represented as an individual and the population is made up of alternative solutions. However, outliers cause the individuals to behave unexpectedly as they try to create rules that explain the whole data set.
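A minimal sketch of this representation, assuming each individual is a single rule with a fixed structure whose numeric values are the only things that evolve, could look as follows; the variable names, thresholds, and data are hypothetical.

    import numpy as np

    rng = np.random.default_rng(2)

    # Hypothetical data: price revision of the issue -> observed first-day return
    X = rng.normal(size=(60, 1))
    y = 0.3 * (X[:, 0] > 0.5) + rng.normal(scale=0.05, size=60)

    def predict(ind, X):
        """Fixed rule structure: IF price_revision > t THEN r_high ELSE r_low.
        Only the numeric values (t, r_high, r_low) of the individual evolve."""
        t, r_high, r_low = ind
        return np.where(X[:, 0] > t, r_high, r_low)

    def error(ind):
        return np.mean((predict(ind, X) - y) ** 2)

    pop = rng.normal(size=(30, 3))              # population of alternative rules
    for generation in range(100):
        pop = pop[np.argsort([error(ind) for ind in pop])]      # rank by fitness
        offspring = (pop[:10][rng.integers(0, 10, size=20)]
                     + rng.normal(scale=0.05, size=(20, 3)))
        pop = np.vstack([pop[:10], offspring])  # survivors plus mutated offspring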

Rule-based system

For example, Quintana et al.[2] first abstract a model with seven major explanatory variables. The rules are then evolved with an evolutionary computation system following the Michigan and Pittsburgh approaches.

Quintana's team treats these factors as the signals that investors focus on. The algorithm they describe shows that a prediction with a high degree of confidence is possible using just a subset of the data.
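The seven explanatory variables themselves are not reproduced here, so the sketch below uses hypothetical variable names purely to illustrate how an evolved IF-THEN rule over a subset of such factors can return a prediction together with a confidence value.

    # Hypothetical evolved rule: it tests only a subset of the available factors.
    RULE = {
        "conditions": [("price_revision", ">", 0.10), ("underwriter_rank", ">=", 8)],
        "prediction": 0.25,     # predicted first-day return
        "confidence": 0.9,      # share of matching training IPOs the rule got right
    }

    OPS = {">": lambda a, b: a > b, ">=": lambda a, b: a >= b}

    def apply_rule(rule, ipo):
        """Return (prediction, confidence) if every condition matches, else None."""
        if all(OPS[op](ipo[var], value) for var, op, value in rule["conditions"]):
            return rule["prediction"], rule["confidence"]
        return None

    ipo = {"price_revision": 0.15, "underwriter_rank": 9, "offer_size": 120.0}
    print(apply_rule(RULE, ipo))    # (0.25, 0.9)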

Two-layered evolutionary forecasting

Luque et al.[3] approach the problem of outliers by performing linear regressions over the set of (input, output) data points. The algorithm handles the data by allocating separate regions for noisy samples. This scheme has the advantage of isolating noisy patterns, which reduces the effect outliers have on the rule-generation system, and the algorithm can return later to check whether the isolated data influence the general data. In their experiments, even the worst results produced by the algorithm outperformed the predictive ability of the other algorithms tested.
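A minimal sketch of the general idea, assuming the regions are simple bins over the input rather than the evolved regions described in the paper, is shown below; a separate linear regression is fitted in each region, and regions with a poor fit are set aside as noisy rather than being allowed to distort the rest of the model.

    import numpy as np

    rng = np.random.default_rng(3)

    # Hypothetical (input, output) pairs with a deliberately noisy pocket
    X = rng.uniform(0, 10, size=(200, 1))
    y = 2.0 * X[:, 0] + rng.normal(scale=0.5, size=200)
    noisy_pocket = X[:, 0] < 2.0
    y[noisy_pocket] += rng.normal(scale=15.0, size=noisy_pocket.sum())

    # Layer 1: allocate regions (here: five equal-width bins over the input)
    regions = np.digitize(X[:, 0], bins=np.linspace(0, 10, 6))

    # Layer 2: fit a linear regression per region, isolating the noisy ones
    models, noisy = {}, []
    for r in np.unique(regions):
        mask = regions == r
        A = np.column_stack([X[mask, 0], np.ones(mask.sum())])
        coef, *_ = np.linalg.lstsq(A, y[mask], rcond=None)
        if np.mean((A @ coef - y[mask]) ** 2) > 2.0:   # arbitrary noise threshold
            noisy.append(r)                            # set aside, revisit later
        else:
            models[r] = coef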

Agent-based modelling

Many of the current algorithms assume homogeneous and rational behavior among investors. An emerging alternative approach to financial modelling, however, is agent-based modelling (ABM). ABM uses a collection of autonomous agents whose behavior evolves endogenously, which leads to complicated system dynamics that are sometimes impossible to predict from the properties of the individual agents.[4] ABM is starting to be applied to computational finance, but for it to be more accurate, better models for rule generation need to be developed.
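A minimal sketch of the ABM idea, assuming a toy population of heterogeneous investors who bid for a new issue and adapt their behavior to the price that emerges from aggregate demand, could look like this; all behavioral rules and numbers are hypothetical.

    import random

    random.seed(4)

    class Investor:
        """Each agent has its own valuation and its own, evolving, aggressiveness."""
        def __init__(self):
            self.valuation = random.gauss(30.0, 5.0)    # private estimate of fair value
            self.aggressiveness = random.uniform(0.8, 1.2)

        def bid(self, offer_price):
            # Agents only bid when the offer looks cheap relative to their valuation
            if offer_price < self.valuation * self.aggressiveness:
                return offer_price * self.aggressiveness
            return None

        def adapt(self, clearing_price):
            # Endogenous change: drift toward what the market as a whole is paying
            self.aggressiveness += 0.1 * (clearing_price / self.valuation
                                          - self.aggressiveness)

    agents = [Investor() for _ in range(500)]
    offer_price = 27.0
    for day in range(5):
        bids = [b for a in agents if (b := a.bid(offer_price)) is not None]
        clearing_price = sum(bids) / len(bids) if bids else offer_price
        for a in agents:
            a.adapt(clearing_price)
        print(f"day {day}: clearing price {clearing_price:.2f}, demand {len(bids)}")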

References

  1. Chou, Shi-Hao; Yen-Sen Ni; William T. Lin (2010). "Forecasting IPO price using GA and ANN simulation". In Proceedings of the 10th WSEAS international conference on Signal processing, computational geometry and artificial vision (ISCGAV'10). World Scientific and Engineering Academy and Society (WSEAS): 145–150.
  2. Quintana, David; Cristóbal Luque; Pedro Isasi (2005). "Evolutionary rule-based system for IPO underpricing prediction". In Proceedings of the 2005 conference on Genetic and evolutionary computation (GECCO '05): 983–989.
  3. Luque, Cristóbal; David Quintana; J. M. Valls; Pedro Isasi (2009). "Two-layered evolutionary forecasting for IPO underpricing". In Proceedings of the Eleventh conference on Congress on Evolutionary Computation (CEC'09). Piscataway, NJ, USA: IEEE Press: 2384–2378.
  4. Brabazon, Anthony; Jiang Dang; Ian Dempsey; Michael O'Neill; David M. Edelman (2010). "Natural Computing in finance: a review" (PDF). Handbook of Natural Computing.