Title: Topics in Optimization via Deep Neural Networks
Topic: Ömer Ekmekçioğlu's Thesis Defense
Time: Jun 29, 2022 11:00 AM Istanbul
Join Zoom Meeting
https://zoom.us/j/6547746234?pwd=ZENZNWtCbUlQRjVMMVFneWtxZGlzZz09
Meeting ID: 654 774 6234
Passcode: 478379
Abstract: We present two studies at the intersection of deep learning and optimization: Deep Portfolio Optimization and Subset-Based Error Recovery. With the emergence of deep models in finance, portfolio optimization has shifted from classical model-based approaches toward data-driven models. However, deep portfolio models generally suffer from the non-stationary nature of financial data, and the results they produce are not always stable. To address this issue, we propose using Graph Neural Networks (GNNs), which allow us to incorporate graphical knowledge that increases the stability of the models and improves results relative to state-of-the-art recurrent architectures. Furthermore, we analyze the algorithmic risk-return trade-off of deep portfolio optimization models to give insights on risk for fully data-driven models.
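To make the graph-based idea concrete, here is a minimal sketch of one graph-convolution step that scores assets over a relation graph and maps the scores to long-only portfolio weights. All numbers, the adjacency matrix, and the untrained weight matrix are illustrative assumptions, not the thesis model:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy setup: 4 assets, 8 past-return features each (all values illustrative).
n_assets, n_feat = 4, 8
X = rng.normal(size=(n_assets, n_feat))          # node features (asset histories)
A = np.array([[0, 1, 1, 0],                      # hypothetical asset-relation graph
              [1, 0, 0, 1],
              [1, 0, 0, 1],
              [0, 1, 1, 0]], dtype=float)

# One graph-convolution layer with symmetric normalization.
A_hat = A + np.eye(n_assets)                     # add self-loops
D_inv_sqrt = np.diag(1.0 / np.sqrt(A_hat.sum(axis=1)))
W = rng.normal(size=(n_feat, 1)) * 0.1           # untrained weights, shapes only
scores = D_inv_sqrt @ A_hat @ D_inv_sqrt @ X @ W # propagate features over graph

# Softmax turns asset scores into long-only weights that sum to 1.
w = np.exp(scores).ravel()
w /= w.sum()
print(w, w.sum())
```

In a trained model the convolution weights would be learned against a portfolio objective; the point of the sketch is only how graph structure enters the forward pass.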
We also propose a data denoising method based on the Extreme Learning Machine (ELM) structure, which lets us invoke the Johnson-Lindenstrauss (JL) Lemma to preserve the Restricted Isometry Property (RIP) and thereby give theoretical recovery guarantees. Furthermore, we show that the method is equivalent to a robust two-layer ELM that implicitly benefits from the proposed denoising algorithm. Current robust ELM methods in the literature rely on well-studied L1 and L2 regularization techniques as well as robust loss functions such as the Huber loss. We extend recent analyses from the robust regression literature so that they apply in more general, non-linear settings and are compatible with any ML algorithm, including Neural Networks (NNs). These methods are useful when the observations are corrupted by heavy noise. We further develop ELM into a general data denoising method that is independent of the downstream ML algorithm. Tests of the denoising and regularized ELM methods are conducted on both synthetic and real data. Our method outperforms its competitors in most scenarios and successfully eliminates most of the noise.
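For readers unfamiliar with ELMs, the core mechanism is a fixed random hidden layer followed by a closed-form (here ridge-regularized) solve for the output weights; fitting noisy observations through it acts as a smoother. This is a generic ELM sketch on synthetic data, not the thesis algorithm, and the target function, noise level, and regularization constant are all assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: noisy samples of a smooth target (illustrative only).
X = np.linspace(-1, 1, 200).reshape(-1, 1)
y_clean = np.sin(3 * X).ravel()
y_noisy = y_clean + rng.normal(scale=0.3, size=y_clean.shape)

# ELM: random input weights are drawn once and never trained.
n_hidden = 50
W = rng.normal(size=(X.shape[1], n_hidden))
b = rng.normal(size=n_hidden)
H = np.tanh(X @ W + b)                      # random hidden representation

# Output weights via ridge-regularized least squares (lambda is a guess).
lam = 1e-2
beta = np.linalg.solve(H.T @ H + lam * np.eye(n_hidden), H.T @ y_noisy)

# The fitted values serve as the denoised version of the observations.
y_denoised = H @ beta
mse_noisy = np.mean((y_noisy - y_clean) ** 2)
mse_denoised = np.mean((y_denoised - y_clean) ** 2)
print(mse_denoised < mse_noisy)
```

Because the hidden layer is random, only a linear solve is needed at training time, which is what makes ELM attractive as a cheap, algorithm-agnostic preprocessing step.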