Saturday, December 6, 2025

Alpha and cost containment - The value of AI

We have written about how hedge funds are trying to contain costs by trading more efficiently. We are also seeing cost containment and efficiency gains through the use of AI. As in consulting, AI can make analysts more efficient at some of their core tasks by summarizing and sifting through data in reports. Using AI on EDGAR filings is not new, but it has become a core part of the work of both discretionary and quantitative researchers.

AI is being used as:

  • information summary tool
  • focused search tool 
  • quick news analysis tool
  • pre-screening tool along with quant analysis 
  • simple idea generator
  • proprietary prompt tool 

While not a research replacement, AI is a research adjunct that allows hedge funds to run leaner shops with lower spending on junior analyst development. The objective is to make senior analysts more efficient by reducing drudgery. Many firms have spent money on proprietary prompt libraries that can be applied to stock sets to serve as an alternative filtering mechanism. This can be especially powerful when linked with proprietary databases.
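
As a rough illustration, a prompt library applied across a ticker set might look like the sketch below. The templates, tickers, and the call_llm stub are hypothetical placeholders, not any firm's actual tooling; a real pipeline would plug in a model API and a filings database.

```python
# A minimal, hypothetical sketch of a prompt library used as a stock filter.
# PROMPT_LIBRARY, call_llm(), and the tickers are illustrative placeholders.

PROMPT_LIBRARY = {
    "margin_risk": "Summarize margin-pressure risks in the latest 10-K for {ticker}.",
    "guidance_change": "Did {ticker} raise or lower forward guidance last quarter?",
}

def call_llm(prompt: str) -> str:
    """Stand-in for a real model API call plus a filings-retrieval step."""
    return f"[model response to: {prompt}]"

def screen_universe(tickers: list[str], template_key: str) -> dict[str, str]:
    """Apply one prompt template across a stock set as a filtering pass."""
    template = PROMPT_LIBRARY[template_key]
    return {t: call_llm(template.format(ticker=t)) for t in tickers}

if __name__ == "__main__":
    for ticker, summary in screen_universe(["AAA", "BBB"], "margin_risk").items():
        print(ticker, "->", summary)
```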




Alpha and cost containment - trading costs

Hedge fund performance centers on alpha generation. Alpha can come in many forms, but one that is clearly dominating the attention of many firms is cost containment. The drive toward cost containment rests on two key factors. First, there is relentless pressure from institutional investors to cut fees; with fees always pushing downward, firms have to become more efficient. Second, as hedge funds increase their trading volume, transaction costs become an increasingly important area for potential value creation.

By cutting down the cost of trading, there is an immediate gain in return that flows through to the bottom line and, in turn, to performance and incentive fees. Lowering the bid-ask spread improves returns. Executing with less slippage again enhances performance. The gain from cost containment is generally immediate and does not have to wait until the ideas embedded in trades generate returns.
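
To make the arithmetic concrete, here is a back-of-the-envelope sketch; the spread, slippage, and turnover figures are illustrative assumptions, not numbers from this post.

```python
# Back-of-the-envelope cost drag from spreads and slippage.
# All figures are illustrative assumptions.

spread_bps = 4.0       # effective half-spread paid per execution, in basis points
slippage_bps = 3.0     # average slippage per execution, in basis points
annual_turnover = 5.0  # portfolio turned over five times per year

# Each round trip pays costs on both the buy and the sell leg.
cost_per_round_trip_bps = 2 * (spread_bps + slippage_bps)
annual_drag_bps = cost_per_round_trip_bps * annual_turnover
print(f"Annual cost drag: {annual_drag_bps:.0f} bps")  # 70 bps

# Shaving 1 bp off each component is an immediate, idea-free return pickup.
saved_bps = 2 * (1.0 + 1.0) * annual_turnover
print(f"Saving 1 bp each on spread and slippage recovers {saved_bps:.0f} bps/year")
```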

Cost containment is especially valuable to firms that are gaining scale. Specialized trading desks, centralized research, and risk management can all exploit economies of scale, providing an edge that will squeeze out smaller firms that cannot match them.

Monday, December 1, 2025

The great equity reset - global dispersion

There is no question that this has been a great year for the Communication Services and Information Technology sectors, up 34.88% and 24.36%, respectively, for the year through November. Still, we are seeing cracks in these sectors, with differentiation across the Mag 7. The key theme to watch remains the rotation into global equities and emerging markets, up 29.84% and 22.40%, respectively, through November. These indices are beating large-, mid-, and small-cap US stocks by a significant margin.

Buying a broad set of US stocks is not the path to success in the current equity markets. We are seeing low correlation across US stocks and high dispersion, so investors need to be selective with their stock choices. This is a global stock-pickers' market. If you are not a stock picker, you can still see it in the differentials across risk premia: high-beta and momentum factors are showing strong returns, while low-volatility and dividend stocks are underperforming, even amid the current talk of market bubbles.

Friday, November 28, 2025

CME Outage, Quant Models, and Prices

Prices are the lifeblood of any quant model. If you feed it the wrong input, you will get incorrect predictions. In the case of an outlier, a single incorrect price may generate misleading signals about future opportunities, so any model should be thoroughly reviewed to assess its sensitivity to wayward inputs.
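
As one illustration of such a review, a simple robust z-score check on daily returns can flag a wayward print before it reaches the model. The threshold and price series below are assumptions for the sketch, not a house standard.

```python
import numpy as np

def flag_wayward_returns(prices: np.ndarray, z_thresh: float = 6.0) -> np.ndarray:
    """Flag one-day log returns that are extreme under a robust (MAD) z-score.

    A single bad tick shows up as two flagged returns: the spike into the
    bad print and the reversal out of it. Threshold is illustrative.
    """
    rets = np.diff(np.log(prices))
    med = np.median(rets)
    mad = np.median(np.abs(rets - med))
    robust_z = 0.6745 * (rets - med) / mad
    return np.abs(robust_z) > z_thresh

prices = np.array([100.0, 100.5, 99.8, 100.2, 100.9, 100.4,
                   101.0, 100.7, 150.0, 101.3, 101.5, 100.9])  # 150.0 is a bad tick
print(np.where(flag_wayward_returns(prices))[0])  # flags the returns into and out of 150.0
```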


If there are incorrect inputs, the model output should be adjusted to reflect the change in data when a replacement is made, and users have to be alerted to any changes. These are the easy cases. There are also more difficult issues, such as data oddities or anomalies.


For example, the CME outage during the Thanksgiving period is a market anomaly that has to be addressed, especially given that it occurred at month-end. To provide context, there was an 11-hour system outage ending at 1335 GMT. It came during the Thanksgiving break, which is associated with low US trading volume, but it affected all global markets on a Friday that was also a month-end. An outage changes trading behavior and produces a surge upon reopening, so the inputs for the open, high, low, and close will be distorted from what they would have been without the outage. This will lead to slightly different signals from any model.


So what should a modeler do about this? One response is to say the price is the price and do nothing. That is reasonable and defensible, yet it may seem odd not to account for some distortion; then again, there is no way to determine the impact of any outage. What would the right price be? Another option is to roll the price back to the last close. This can be defended, but replacing data seems somewhat arbitrary.
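
If the roll-back route is taken, the mechanics can be as simple as the hypothetical sketch below, which also covers the earlier point that users have to be alerted and that the raw series should be kept intact.

```python
import numpy as np

def roll_back_to_last_close(prices: np.ndarray, bad_idx: int) -> np.ndarray:
    """Illustrative version of the 'use the last close' option: overwrite the
    suspect print with the prior close, keep the raw series, and alert users."""
    adjusted = prices.copy()
    adjusted[bad_idx] = prices[bad_idx - 1]
    print(f"ALERT: price at index {bad_idx} replaced "
          f"{prices[bad_idx]:.2f} -> {adjusted[bad_idx]:.2f}; raw series retained.")
    return adjusted

raw = np.array([100.0, 100.5, 99.8, 100.2, 150.0, 101.3])  # 150.0 is suspect
adjusted = roll_back_to_last_close(raw, bad_idx=4)
```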


The best response is to focus on the output and look at the marginal trade signals generated. Does it matter? An output sensitivity analysis can be conducted to compare what happens with the new prices against what would have happened if no change had been made. If the marginal changes are small, keep the latest prices. If there is a large set of new signals, investigate why and flag the changes. The prices can be kept, but the flagged trades can be ignored. That, however, creates another set of problems if the trade is closing an existing position: when do you exit the old position?
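
A minimal version of that sensitivity check might look like the following sketch, with a toy momentum rule standing in for the real model; all names and numbers are illustrative assumptions.

```python
import numpy as np

def momentum_signal(prices: np.ndarray, lookback: int = 5) -> np.ndarray:
    """Toy stand-in for the real model: sign of the trailing return."""
    return np.sign(prices[lookback:] / prices[:-lookback] - 1.0)

def signal_diff_report(raw: np.ndarray, adjusted: np.ndarray) -> None:
    """Rerun the model on raw vs. adjusted prices and flag signals that flip."""
    flipped = np.where(momentum_signal(raw) != momentum_signal(adjusted))[0]
    if flipped.size == 0:
        print("No signal changes; keep the latest prices.")
    elif flipped.size <= 2:
        print(f"Marginal changes only at {flipped}; keep the latest prices.")
    else:
        print(f"{flipped.size} signals changed at {flipped}; investigate and flag.")

raw = np.array([100.0, 100.5, 99.8, 100.2, 100.9, 100.4,
                101.0, 100.7, 150.0, 101.3, 101.5, 100.9])
adjusted = raw.copy()
adjusted[8] = raw[7]  # the roll-back repair for the suspect print
signal_diff_report(raw, adjusted)  # here the toy signal does not flip at all
```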


These real-life problems tell the user that there is no such thing as a fully automated system.