Regime change isn't just about market behaviour
In this video with Kiril Sokoloff (of 13D Research, the independent institutional investor research firm), famed investor Stanley Druckenmiller talks about his investing record (a 30% average annual return over 30 years, with only 5 down quarters across that time) and his views on everything from the use of algorithms to the US government’s conundrum of deciding whether to raise rates or print more money (quantitative easing).
It’s a great sneak peek into the mind of this investing Jedi master, but I want to home in on the part where Druckenmiller mentions algorithms in investing. He states he has no clear view on where the use of algorithms is leading, and that it’s been a problem for investors like him who have traditionally traded on news and price action (in combination with a top-down macro view of markets). He finds that many of the factors and patterns that algorithms flag as positive signals are not aligning with what ends up happening in the market (citing the outperformance of the Pharmaceuticals sector in the US this year as an example).
One of the key reasons for this is that algorithms are often built on historical models, but when markets go through a regime change every 10 to 15 years, the factors in those models can become inappropriate. Despite this, the algos still tend to outperform traditional non-algo money managers (because money managers have traditionally been so bad, especially when you look at the battle between active and passive investing).
The ultimate solution, however, does not appear to be man or machine (in a man-versus-machine world) but rather man (and his intuition) working in conjunction with the machine. He also mentions the work done on this by chess Grandmaster Garry Kasparov, whom I saw speak on this topic earlier this year at a technology conference.
In a world where AI and algorithms are continually put front and centre of corporate strategies (mentions of artificial intelligence in US corporate earnings calls were up 25% quarter over quarter in Q3 2017, to 791 times), the risk of buying into “hype” over “substance” can be all too real.
There’s a similar story in the institutional banking space. Having worked in that world over the last 4 years, I can safely say I’ve seen the hype outweigh the substance there too, and nowhere is this clearer than in how technology is continually underutilised in the industry. Just as markets go through regime changes every few years, so too should we re-examine how analysis is done in this industry. There’s certainly room for a regime change, and below I give a few examples for both sell side and buy side participants.
Key problem areas where technology can help investment banking
Sell side example problem
It often takes time to pull together the data and then transform that into the tables and charts to support a particular research point of view.
Multiply this out over a number of research reports, add bespoke research to the mix, and the typical equity research analyst can find themselves spending more time wrangling data than doing actual analysis.
This is often mitigated by researchers spending long hours at the office and working weekends to put out content, but the risk of burnout and the lack of quality time away from the office can be detrimental.
Sell side example solution
Solving this problem can be done in a number of ways, and the branches of the solution depend highly on each research firm’s specific nuances.
An example that could be deployed involves the use of what have traditionally been called business intelligence/dashboard/reporting tools. The reason to use these is that you can automate many of the repeatable processes that go into pulling together, manipulating and transforming data into the tables, charts and other graphics for your reports.
I used a mixture of QlikView, Qlik Sense and Power BI to automate parts of my data wrangling processes and it saved me many hours of time compared to not having these tools at my disposal.
A solution like this takes us part of the way there. When new data comes out, or I want an update on what’s happening in the markets, I can click a few buttons to kick-start the process of refreshing the database and the various reports that run on the back of it.
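Outside a BI tool, the same idea can be sketched in plain Python: one function refreshes the dataset and regenerates the report table in a single step. Everything here (the tickers, prices and output file name) is hypothetical, purely to show the shape of an automated refresh:

```python
import csv
import statistics

# Hypothetical raw data: (ticker, closing price) records as they
# might arrive from a market data feed each day.
RAW_PRICES = [
    ("ACME", 101.5), ("ACME", 103.0), ("ACME", 99.8),
    ("BOLT", 54.2), ("BOLT", 55.1), ("BOLT", 53.9),
]

def summarise(prices):
    """Collapse raw price records into one summary row per ticker."""
    by_ticker = {}
    for ticker, close in prices:
        by_ticker.setdefault(ticker, []).append(close)
    return [
        {"ticker": t, "last": closes[-1],
         "mean": round(statistics.mean(closes), 2),
         "low": min(closes), "high": max(closes)}
        for t, closes in sorted(by_ticker.items())
    ]

def refresh_report(prices, out_path):
    """One-click refresh: rebuild the summary table and write it out."""
    rows = summarise(prices)
    with open(out_path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=rows[0].keys())
        writer.writeheader()
        writer.writerows(rows)
    return rows

if __name__ == "__main__":
    for row in refresh_report(RAW_PRICES, "daily_summary.csv"):
        print(row)
```

Run on a schedule (or behind a button), this replaces the manual copy-paste-and-format cycle for any report whose shape stays the same from day to day.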
However, when pulling together reports, the user still needs to navigate the dashboard tool they’re using and manually pull together charts and tables they need. Even this process can be automated.
You might ask why such a task should even be considered for further automation (haven’t we done enough?), but when you have analysts writing a couple of research reports each day with nuanced narratives based on particular market views, the process of getting the right tables and charts together adds up to valuable time.
A lot of this part of the process could actually be automated. Using skilled web developers (or even off-the-shelf tools), the charts and tables required for a particular report could be pulled together automatically. The narratives written by equity researchers could also be generated by automated commentary tools like Yseop (which we used as part of a proof-of-concept project at Canaccord).
These automated commentary tools require some programming but have been deployed in a number of industries, including sports (see Yahoo Sports reports), insurance and wealth management (where portfolio reports are the typical use case). Granted, the wording can sound repetitive (even with the variability these tools inject into their comments), but the fact that commentary can be generated across an entire database of stocks (at Canaccord we had nearly 12,000) means a new report can be written on every stock every single day, far more output than analysts could ever produce. To add value, equity researchers could then go over the top of these pre-built reports and edit them to add their own flavour.
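To make the idea concrete, here is a minimal sketch of template-based commentary generation in plain Python. This is not Yseop (whose engine is proprietary); the stocks, thresholds and wording below are all invented, but it shows how one template plus a few synonym pools can produce a line of commentary per stock across a whole database:

```python
import random

# Hypothetical per-stock stats, as might come out of a daily data refresh.
STOCKS = [
    {"ticker": "ACME", "pct_change": 3.2, "sector": "Pharmaceuticals"},
    {"ticker": "BOLT", "pct_change": -1.4, "sector": "Energy"},
    {"ticker": "CRUX", "pct_change": 0.1, "sector": "Financials"},
]

# Synonym pools give the output some variability, which is roughly how
# commercial commentary tools avoid sounding completely robotic.
UP_WORDS = ["climbed", "advanced", "gained"]
DOWN_WORDS = ["slipped", "retreated", "fell"]

def comment(stock, rng=random):
    """Turn one stock's stats into a sentence of commentary."""
    move = stock["pct_change"]
    if move > 0.5:
        verb = rng.choice(UP_WORDS)
        return f"{stock['ticker']} {verb} {move:.1f}% in {stock['sector']} trading."
    if move < -0.5:
        verb = rng.choice(DOWN_WORDS)
        return f"{stock['ticker']} {verb} {abs(move):.1f}%, weighing on the {stock['sector']} sector."
    return f"{stock['ticker']} was little changed ({move:+.1f}%)."

if __name__ == "__main__":
    for s in STOCKS:
        print(comment(s))
```

Looping `comment` over every stock in the database gives a draft report per name per day; the analyst's job then becomes editing and enriching, not drafting from scratch.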
Pulling all of this together means analysts can spend even less time assembling data (especially for repetitive report styles) and more of their time finding new investment ideas to analyse.
Buy side example problem
The buy side isn’t immune from this lack of technological integration either, and whilst I’ve heard stories of some firms embracing reporting tools, there are still many that don’t. Whilst tools like R and Python are favoured amongst hedge funds for their ability to do proper data science and analyse large-scale datasets, they tend not to be conducive to easy data exploration.
As a result, the use of these tools has tended to be the sole domain of data scientists, and because it is so siloed, there is a pause in the decision-making workflow when it comes to analysing investments. In many cases, data scientists have to manually alert portfolio managers and investment analysts to potential problems.
Tools like R and Python are ubiquitous on the buy side, but they suffer from a stop-start nature that breaks a free-flowing workflow. Most data scientists have used notebooks like those provided by Jupyter (http://jupyter.org/), which allow you to write in R or Python. However, whilst notebooks have been great for individual analysis and for academic purposes, operationalising them and sharing them with other users in the organisation has been limited.
When you see a Jupyter notebook in its native form, much of the time it reads like academic text and is not very filterable (i.e. to make changes you need to type them in, rather than click on drop-down selection boxes).
This means that insights analysts and data scientists produce with these tools can become non-operational if they get stuck in those notebooks and are not easily shared with others.
Using tools which can integrate the dashboard/reporting analysis process in combination with tools like Python and R is important if firms want to operationalise their insights.
It’s been through the use of reporting/dashboarding tools that many organisations outside of finance have been able to find ways to make quicker and better decisions.
Many firms will say they avoid these problems by only using simpler analysis tools like Excel and the features of their market information platform (Bloomberg, Morningstar, Refinitiv, FactSet etc.), but by ignoring the potential insights that come with larger-scale analytics tools, these firms can miss out on better returns or improved risk protection.
Not being able to analyse information quickly enough, or relying on manual systems, can lead to sub-optimal investment decision-making, so the buy side can see improvements thanks to technology too.
Buy side example solution
I mentioned Qlik Sense earlier; it also provides a means by which Python and R code can be run on its datasets, with the resulting reports becoming filterable and interactive (unlike the majority of workbooks created in Python/R alone).
The example below shows how Python has been used to create interactivity within a reporting tool like Qlik Sense, allowing data science modelling to take place alongside interactive analysis (in the buy side’s case, with Portfolio Managers and Investment Analysts).
Here's a GitHub link to the work done by Qlik engineer Nabeel Asif where he shows how Python and Qlik can interact (https://github.com/nabeel-oz/qlik-py-tools). In it, there are instructions as well as some example Qlik Sense applications covering data science techniques such as correlations, forecasting and machine learning.
Another way to overcome this lack of interactivity and workflow breakdown is for buy side firms to take advantage of the interactive tools that can be built around their Python/R workflows. A good example I’ve recently been studying is Dash, provided by the charting company Plotly (https://plot.ly/).
An example of what Dash can do is shown here (https://dash.plot.ly/gallery) and it’s a far cry from what normal users of R or Python are used to.
In fact, here is a live example of an Oil & Gas dashboard: https://dash-gallery.plotly.host/dash-oil-and-gas
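As a rough sketch of what such a dashboard looks like in code, here is a minimal Dash app. The tickers and prices are made up, and Dash itself is assumed to be installed; the filtering logic is kept as a plain function so it stays usable (and testable) outside the app:

```python
# Hypothetical daily closes per ticker.
PRICES = {
    "ACME": [101.5, 103.0, 99.8, 104.2],
    "BOLT": [54.2, 55.1, 53.9, 56.0],
}

def series_for(ticker):
    """Plain filtering logic, usable with or without Dash."""
    closes = PRICES[ticker]
    days = list(range(1, len(closes) + 1))
    return days, closes

def build_app():
    # Imports live inside the function so the logic above still works
    # in environments where Dash isn't installed.
    from dash import Dash, dcc, html, Input, Output
    import plotly.graph_objects as go

    app = Dash(__name__)
    app.layout = html.Div([
        # The drop-down is exactly the interactivity a raw notebook lacks.
        dcc.Dropdown(id="ticker", value="ACME",
                     options=[{"label": t, "value": t} for t in PRICES]),
        dcc.Graph(id="chart"),
    ])

    @app.callback(Output("chart", "figure"), Input("ticker", "value"))
    def update_chart(ticker):
        days, closes = series_for(ticker)
        return go.Figure(data=[go.Scatter(x=days, y=closes, mode="lines")])

    return app

# To serve it locally (this blocks): build_app().run(debug=True)
```

A portfolio manager can now click through tickers instead of asking a data scientist to re-run a notebook, which is the workflow break described above.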
It’s clear this is not your normal Jupyter notebook, and whilst traditional reporting/dashboarding tools provide many, many more features, there are good reasons for sticking with solutions like this. Firms with a strong data science focus, whose staff have the necessary coding expertise, can easily take advantage of Dash and other tools that aim to further operationalise the insights gained from large-scale data analysis.
Another situation where tools like Dash can help is data analysis hubs. Many firms are doing more to publish their data online for analysis by third parties (such as the public or paying subscribers). Where companies don’t need all the features of traditional reporting/dashboarding tools, something like Dash can help, especially if they’re already using Jupyter notebooks as their go-to workspace for data analysis. An example is Quantopian (www.quantopian.com), whose notebooks area (built to house Python code) could look so much better and become far more intuitive (and user-friendly) if it employed the capabilities of an add-on like Dash.
For those interested, a great intro to using Dash has been written by Kevin Boller of ‘Towards Data Science’ here on Medium (https://towardsdatascience.com/python-for-finance-dash-by-plotly-ccf84045b8be). In the article he shows how Dash can be layered on top of the already interactive charts provided by Plotly, helping remove the bottleneck that sharing Jupyter notebooks once was.
For anyone wanting more detail on how to use Dash, there is a Udemy course by Jose Portilla here (https://www.udemy.com/interactive-python-dashboards-with-plotly-and-dash/ - full disclosure: I bought the 9.5-hour video course for AU$18.99).
What else to look out for
Whilst the examples above reflect some of what I saw on both the buy and sell sides of the professional investing marketplace, there are still many more opportunities to improve workflow processes and thereby produce better investment analysis and better investment decisions.
In a world reeling from losses across the board in 2018, professionals owe it to their investors to do better in 2019. Further technology adoption can be a big part of that.
Tying it back into Druckenmiller’s investment process
Tying this back into Stanley Druckenmiller’s process: during the talk, he described how he typically invests.
Build a thesis that hopefully no one else has built
Put some positions on
Then, when the thesis starts to evolve, people get on board and you see momentum start to change in your favour, you really go for it and pile into the trade
Bringing technology into the above process, working alongside human investors, might look something like this:
Build a thesis that hopefully no one else has built > Use machine learning to analyse multiple potential market strategies based on the multiple theses your firm holds
Put some positions on > Backtest these positions across historical periods, building in variations / regime changes. Create machine-led analysis of these and, with human oversight, put the trades on.
Then when the thesis starts to evolve, people get on board and momentum changes in your favour, you really go for it and pile into the trade > Pile into the trade after further machine-augmented analysis to ensure the thesis/strategy still holds true. Invest further once the hurdles have been cleared.
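A toy version of the middle step, backtesting a position across historical windows that span different regimes, might look like the sketch below. The return series, regime labels and the "thesis holds" hurdle are all invented for illustration:

```python
import statistics

# Hypothetical monthly strategy returns, each tagged with a market regime.
HISTORY = [
    ("low_rates", 0.02), ("low_rates", 0.01), ("low_rates", 0.03),
    ("rising_rates", -0.01), ("rising_rates", 0.00), ("rising_rates", -0.02),
    ("qe", 0.04), ("qe", 0.02), ("qe", 0.01),
]

def backtest_by_regime(history):
    """Average the strategy's return separately within each regime."""
    buckets = {}
    for regime, ret in history:
        buckets.setdefault(regime, []).append(ret)
    return {regime: round(statistics.mean(rets), 4)
            for regime, rets in buckets.items()}

def thesis_holds(results, hurdle=0.0):
    """Crude rule: the thesis 'holds' if no regime loses money on average."""
    return all(mean >= hurdle for mean in results.values())

if __name__ == "__main__":
    results = backtest_by_regime(HISTORY)
    print(results)
    print("pile in" if thesis_holds(results) else "hold off")
```

The human oversight step then sits on top of output like this: a strategy that only works in one regime (here it loses money when rates rise) is exactly the kind Druckenmiller warns historical models will mistake for a durable signal.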
Looking at all of this, there is still a lot of opportunity for investment firms to find further ways to differentiate and improve their processes today. It's well worth investigating if they want success in 2019.