
Supervised Similarity for High-Yield Corporate Bonds with Quantum Cognition Machine Learning

Author

Listed:
  • Joshua Rosaler
  • Luca Candelori
  • Vahagn Kirakosyan
  • Kharen Musaelian
  • Ryan Samson
  • Martin T. Wells
  • Dhagash Mehta
  • Stefano Pasquali

Abstract

We investigate the application of quantum cognition machine learning (QCML), a novel paradigm for both supervised and unsupervised learning tasks rooted in the mathematical formalism of quantum theory, to distance metric learning in corporate bond markets. Compared to equities, corporate bonds are relatively illiquid, and trade and quote data for these securities are sparse. A measure of distance/similarity among corporate bonds is therefore particularly useful for a variety of practical applications in the trading of illiquid bonds, including identifying similar tradable alternatives, pricing securities with few recent quotes or trades, and explaining the predictions and performance of ML models in terms of their training data. Previous research has explored supervised similarity learning based on classical tree-based models in this context; here, we apply the QCML paradigm to supervised distance metric learning in the same setting, showing that it outperforms classical tree-based models in high-yield (HY) markets, while giving comparable or better performance (depending on the evaluation metric) in investment-grade (IG) markets.
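The classical baseline the abstract contrasts with QCML learns a supervised distance from tree ensembles: two bonds are deemed similar when a forest trained on a pricing-related target routes them into the same leaves. Below is a minimal, hypothetical sketch of that random-forest-proximity approach (not the authors' QCML method); the synthetic features, target, and parameter choices are illustrative assumptions only.

```python
# Minimal sketch (assumed setup, not the paper's code): supervised similarity
# via random-forest proximities, the classical tree-based baseline.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)
n_bonds, n_features = 200, 5          # hypothetical features, e.g. duration, OAS, coupon, rating, DTS
X = rng.normal(size=(n_bonds, n_features))
y = X @ rng.normal(size=n_features) + 0.1 * rng.normal(size=n_bonds)  # illustrative pricing-related target

forest = RandomForestRegressor(n_estimators=300, random_state=0).fit(X, y)

# Proximity of two bonds = fraction of trees in which they fall in the same leaf.
leaves = forest.apply(X)                                   # (n_bonds, n_trees) leaf indices
prox = (leaves[:, None, :] == leaves[None, :, :]).mean(axis=2)
dist = 1.0 - prox                                          # supervised distance matrix

# Retrieve the k most similar tradable alternatives to bond 0.
k = 5
neighbors = np.argsort(dist[0])[1:k + 1]
print("Nearest alternatives to bond 0:", neighbors)
```

In practice the forest would be trained on actual bond characteristics and a market target such as spread or return, and the resulting distance matrix would feed nearest-neighbor retrieval or relative-value pricing of thinly quoted bonds.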

Suggested Citation

  • Joshua Rosaler & Luca Candelori & Vahagn Kirakosyan & Kharen Musaelian & Ryan Samson & Martin T. Wells & Dhagash Mehta & Stefano Pasquali, 2025. "Supervised Similarity for High-Yield Corporate Bonds with Quantum Cognition Machine Learning," Papers 2502.01495, arXiv.org.
  • Handle: RePEc:arx:papers:2502.01495

    Download full text from publisher

    File URL: http://arxiv.org/pdf/2502.01495
    File Function: Latest version
    Download Restriction: no

    References listed on IDEAS

    1. Mingshu Li & Bhaskarjit Sarmah & Dhruv Desai & Joshua Rosaler & Snigdha Bhagat & Philip Sommer & Dhagash Mehta, 2024. "Quantile Regression using Random Forest Proximities," Papers 2408.02355, arXiv.org.
    2. Preetha Saha & Jingrao Lyu & Dhruv Desai & Rishab Chauhan & Jerinsh Jeyapaulraj & Philip Sommer & Dhagash Mehta, 2024. "Machine Learning-based Relative Valuation of Municipal Bonds," Papers 2408.02273, arXiv.org.
    3. Lin, Yi & Jeon, Yongho, 2006. "Random Forests and Adaptive Nearest Neighbors," Journal of the American Statistical Association, American Statistical Association, vol. 101, pages 578-590, June.
    4. Nathalia Castellanos & Dhruv Desai & Sebastian Frank & Stefano Pasquali & Dhagash Mehta, 2024. "Can an unsupervised clustering algorithm reproduce a categorization system?," Papers 2408.10340, arXiv.org.
    5. Dhruv Desai & Ashmita Dhiman & Tushar Sharma & Deepika Sharma & Dhagash Mehta & Stefano Pasquali, 2023. "Quantifying Outlierness of Funds from their Categories using Supervised Similarity," Papers 2308.06882, arXiv.org.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Gregory Yampolsky & Dhruv Desai & Mingshu Li & Stefano Pasquali & Dhagash Mehta, 2024. "Case-based Explainability for Random Forest: Prototypes, Critics, Counter-factuals and Semi-factuals," Papers 2408.06679, arXiv.org.
    2. Philippe Goulet Coulombe & Maximilian Goebel & Karin Klieber, 2024. "Dual Interpretation of Machine Learning Forecasts," Papers 2412.13076, arXiv.org.
    3. Ke-Lin Du & Rengong Zhang & Bingchun Jiang & Jie Zeng & Jiabin Lu, 2025. "Foundations and Innovations in Data Fusion and Ensemble Learning for Effective Consensus," Mathematics, MDPI, vol. 13(4), pages 1-49, February.
    4. Goldstein Benjamin A & Polley Eric C & Briggs Farren B. S., 2011. "Random Forests for Genetic Association Studies," Statistical Applications in Genetics and Molecular Biology, De Gruyter, vol. 10(1), pages 1-34, July.
    5. Borup, Daniel & Christensen, Bent Jesper & Mühlbach, Nicolaj Søndergaard & Nielsen, Mikkel Slot, 2023. "Targeting predictors in random forest regression," International Journal of Forecasting, Elsevier, vol. 39(2), pages 841-868.
    6. Jerinsh Jeyapaulraj & Dhruv Desai & Peter Chu & Dhagash Mehta & Stefano Pasquali & Philip Sommer, 2022. "Supervised similarity learning for corporate bonds using Random Forest proximities," Papers 2207.04368, arXiv.org, revised Oct 2022.
    7. Jared S Laufenberg & Joseph D Clark & Richard B Chandler, 2018. "Estimating population extinction thresholds with categorical classification trees for Louisiana black bears," PLOS ONE, Public Library of Science, vol. 13(1), pages 1-12, January.
    8. Sexton, Joseph & Laake, Petter, 2009. "Standard errors for bagged and random forest estimators," Computational Statistics & Data Analysis, Elsevier, vol. 53(3), pages 801-811, January.
    9. Joshua Rosaler & Dhruv Desai & Bhaskarjit Sarmah & Dimitrios Vamvourellis & Deran Onay & Dhagash Mehta & Stefano Pasquali, 2023. "Enhanced Local Explainability and Trust Scores with Random Forest Proximities," Papers 2310.12428, arXiv.org, revised Aug 2024.
    10. David M. Ritzwoller & Vasilis Syrgkanis, 2024. "Simultaneous Inference for Local Structural Parameters with Random Forests," Papers 2405.07860, arXiv.org, revised Sep 2024.
    11. Mendez, Guillermo & Lohr, Sharon, 2011. "Estimating residual variance in random forest regression," Computational Statistics & Data Analysis, Elsevier, vol. 55(11), pages 2937-2950, November.
    12. Li, Yiliang & Bai, Xiwen & Wang, Qi & Ma, Zhongjun, 2022. "A big data approach to cargo type prediction and its implications for oil trade estimation," Transportation Research Part E: Logistics and Transportation Review, Elsevier, vol. 165(C).
    13. Yi Fu & Shuai Cao & Tao Pang, 2020. "A Sustainable Quantitative Stock Selection Strategy Based on Dynamic Factor Adjustment," Sustainability, MDPI, vol. 12(10), pages 1-12, May.
    14. Ishwaran, Hemant & Kogalur, Udaya B., 2010. "Consistency of random survival forests," Statistics & Probability Letters, Elsevier, vol. 80(13-14), pages 1056-1064, July.
    15. José María Sarabia & Faustino Prieto & Vanesa Jordá & Stefan Sperlich, 2020. "A Note on Combining Machine Learning with Statistical Modeling for Financial Data Analysis," Risks, MDPI, vol. 8(2), pages 1-14, April.
    16. Biau, Gérard & Devroye, Luc, 2010. "On the layered nearest neighbour estimate, the bagged nearest neighbour estimate and the random forest method in regression and classification," Journal of Multivariate Analysis, Elsevier, vol. 101(10), pages 2499-2518, November.
17. Olivier Biau & Angela D'Elia, 2010. "Euro Area GDP Forecast Using Large Survey Dataset - A Random Forest Approach," EcoMod2010 259600029, EcoMod.
    18. Cleridy E. Lennert‐Cody & Richard A. Berk, 2007. "Statistical learning procedures for monitoring regulatory compliance: an application to fisheries data," Journal of the Royal Statistical Society Series A, Royal Statistical Society, vol. 170(3), pages 671-689, July.
    19. Ruoqing Zhu & Donglin Zeng & Michael R. Kosorok, 2015. "Reinforcement Learning Trees," Journal of the American Statistical Association, Taylor & Francis Journals, vol. 110(512), pages 1770-1784, December.
    20. Susan Athey & Julie Tibshirani & Stefan Wager, 2016. "Generalized Random Forests," Papers 1610.01271, arXiv.org, revised Apr 2018.

    More about this item

    NEP fields

    This paper has been announced in the following NEP Reports:

    Statistics

    Access and download statistics

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:arx:papers:2502.01495. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to register here. This allows you to link your profile to this item, and to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: arXiv administrators (email available below). General contact details of provider: http://arxiv.org/.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.