Insights
We love to solve hard problems. Sharing ideas is integral to how we do it.
Whether an idea directly applies to our work or is simply interesting, we voraciously consume cutting-edge research—and contribute thinking of our own.
- Markets & Economy
Share Buybacks: A Brief Investigation
We examine whether the rise in stock buybacks has artificially propped up equity prices, suppressed market volatility, and weakened corporate balance sheets.
- Data Science
Interpretability Methods in Machine Learning: A Brief Survey
A Two Sigma AI engineer outlines several approaches for understanding how machine learning models arrive at the answers they do.
- Data Science
- Policy
AI Past and Future: A Conversation with David Siegel and Kai-Fu Lee
Speaking at the 2019 Milken Institute Global Conference, Two Sigma co-founder David Siegel discusses the challenges and opportunities AI offers for individuals, companies, and societies.
- Markets & Economy
Two Sigma Factor Lens: Forecasting Factor Returns
We introduce a new paper proposing a methodology for using historical data to quantify the return premia for major asset-class-based factors.
- Technology
Agile Cloud Security
Two Sigma engineers explore key challenges and opportunities they encountered while systematically rebuilding cloud security processes in an automated, agile manner.
- Policy
- Technology
David Siegel on the “Rules for the AI Race”: A WEF Roundtable
Speaking on a panel at the 2019 World Economic Forum, Two Sigma co-founder David Siegel discusses key challenges and opportunities as computers assume greater decision-making power globally.
- Career Journeys
- Technology
Building a High-Throughput Metrics System Using Open Source Software
A Two Sigma engineer shares key lessons learned while building a high-performance metrics system based on customized open source building blocks.
- Data Science
Gradient Sparsification for Communication-Efficient Distributed Optimization
Modern large-scale machine learning applications require stochastic optimization algorithms to be implemented on distributed computational architectures, where a key bottleneck is the communication overhead of exchanging information such as stochastic gradients among workers. To reduce this communication cost, we propose a convex optimization formulation that minimizes the coding length of stochastic gradients.
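To give a flavor of the idea (a minimal sketch, not the paper's actual method): an unbiased sparsification scheme drops each gradient coordinate with some probability and rescales the survivors, so workers need only exchange the remaining nonzero entries. In the sketch below, the function name `sparsify_gradient` and the magnitude-proportional keep probabilities are illustrative assumptions; the paper instead derives the probabilities by solving a convex optimization problem.

```python
import numpy as np

def sparsify_gradient(grad, budget, rng=None):
    """Unbiasedly sparsify a gradient vector (illustrative sketch).

    Keeps coordinate i with probability p_i and rescales survivors by
    1/p_i, so the sparsified vector equals the original gradient in
    expectation. Here p_i is a simple magnitude-proportional heuristic
    (an assumption for illustration), capped at 1.

    budget: approximate expected number of nonzero coordinates to keep.
    """
    rng = rng or np.random.default_rng()
    mag = np.abs(grad)
    total = mag.sum()
    if total == 0.0:
        return np.zeros_like(grad)
    # Heuristic keep probabilities, proportional to magnitude, capped at 1.
    p = np.minimum(1.0, budget * mag / total)
    keep = rng.random(grad.shape) < p
    sparse = np.zeros_like(grad)
    # Rescale surviving coordinates by 1/p_i to preserve unbiasedness:
    # E[sparse_i] = p_i * (g_i / p_i) = g_i.
    sparse[keep] = grad[keep] / p[keep]
    return sparse
```

Because the sparsified vector is unbiased, the scheme trades a controlled increase in gradient variance for a large reduction in communication: only the nonzero index-value pairs need to be exchanged among workers.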