I will briefly describe the two key concepts behind this post. Borrowing from Wikipedia, trend trading can be defined as:
... a trading strategy according to which one should buy an asset when its price trend goes up, and sell when its trend goes down, expecting price movements to continue.
Again borrowing from Wikipedia, backtesting:
... seeks to estimate the performance of a strategy or model if it had been employed during a past period. This requires simulating past conditions with sufficient detail, making one limitation of backtesting the need for detailed historical data.
The helper notebook below provides the imports and the plotting helpers (the afa module used further down):

%run modules_algo_trading_v4.ipynb
The data can be downloaded either from Yahoo Finance or from Quandl. First, from Yahoo:

start, end = datetime.datetime(2000, 1, 1), datetime.datetime(2018, 1, 1)
apple = pdr.get_data_yahoo('AAPL', start=start, end=end)
apple.head()
Date | Open | High | Low | Close | Adj Close | Volume
---|---|---|---|---|---|---
2000-01-03 | 3.745536 | 4.017857 | 3.631696 | 3.997768 | 3.543045 | 133949200 |
2000-01-04 | 3.866071 | 3.950893 | 3.613839 | 3.660714 | 3.244329 | 128094400 |
2000-01-05 | 3.705357 | 3.948661 | 3.678571 | 3.714286 | 3.291807 | 194580400 |
2000-01-06 | 3.790179 | 3.821429 | 3.392857 | 3.392857 | 3.006939 | 191993200 |
2000-01-07 | 3.446429 | 3.607143 | 3.410714 | 3.553571 | 3.149374 | 115183600 |
A quick check confirms there are no missing values:

apple.isnull().any().unique()
array([False])
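The check returns `array([False])`, so there are no gaps to repair here. Had there been missing rows (delistings, bad ticks), a minimal forward-fill sketch would look like this — the toy frame and its `Close` column are illustrative, not part of the download above:

```python
import numpy as np
import pandas as pd

# toy daily price series with one gap
prices = pd.DataFrame(
    {'Close': [100.0, np.nan, 102.0]},
    index=pd.date_range('2000-01-03', periods=3, freq='B'))

# forward-fill propagates the last known price into the gap
filled = prices['Close'].ffill()
print(filled.isnull().any())  # False
```

Forward-filling is the usual choice for prices, since it carries the last traded value forward rather than inventing one.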
And the same data from Quandl:

start, end = "2000-01-01", "2018-01-01"
apple = quandl.get("WIKI/AAPL", start_date=start, end_date=end)
apple.columns = [el.lower().replace('.', '').replace(' ', '_') for el in apple.columns]
apple.head()
Date | open | high | low | close | volume | ex-dividend | split_ratio | adj_open | adj_high | adj_low | adj_close | adj_volume
---|---|---|---|---|---|---|---|---|---|---|---|---
2000-01-03 | 104.87 | 112.50 | 101.69 | 111.94 | 4783900.0 | 0.0 | 1.0 | 3.369314 | 3.614454 | 3.267146 | 3.596463 | 133949200.0 |
2000-01-04 | 108.25 | 110.62 | 101.19 | 102.50 | 4574800.0 | 0.0 | 1.0 | 3.477908 | 3.554053 | 3.251081 | 3.293170 | 128094400.0 |
2000-01-05 | 103.75 | 110.56 | 103.00 | 104.00 | 6949300.0 | 0.0 | 1.0 | 3.333330 | 3.552125 | 3.309234 | 3.341362 | 194580400.0 |
2000-01-06 | 106.12 | 107.00 | 95.00 | 95.00 | 6856900.0 | 0.0 | 1.0 | 3.409475 | 3.437748 | 3.052206 | 3.052206 | 191993200.0 |
2000-01-07 | 96.50 | 101.00 | 95.50 | 99.50 | 4113700.0 | 0.0 | 1.0 | 3.100399 | 3.244977 | 3.068270 | 3.196784 | 115183600.0 |
plt.rcParams['figure.figsize'] = 16, 8
apple['close'].plot(grid=False, rot=90)
plt.show()
Note: pandas-datareader's Yahoo endpoint needs the fix_yahoo_finance patch, applied before calling get_data_yahoo:

import fix_yahoo_finance as yf
yf.pdr_override()
The strategy consists in:

* defining a short moving window smw and a long moving window lmw;
* creating a DataFrame for signals, called sig, holding the short and long moving averages short_ma and long_ma of the closing price;
* filling a signal column (starting from row smw, so the short average has a full window behind it), inserting ones when the value of column short_ma is larger than long_ma and zeros otherwise;
* differencing signal to obtain the trading positions pos: +1 marks a buy, -1 a sell.
smw, lmw = 40, 100
sig = pd.DataFrame(index=apple.index)
# short and long simple moving averages of the closing price
sig['short_ma'] = apple['close'].rolling(window=smw, min_periods=1).mean()
sig['long_ma'] = apple['close'].rolling(window=lmw, min_periods=1).mean()
# 1.0 when the short average is above the long one, starting from row smw;
# .iloc avoids the chained-assignment pitfall of sig['signal'][smw:] = ...
sig['signal'] = 0.0
sig.iloc[smw:, sig.columns.get_loc('signal')] = np.where(
    sig['short_ma'][smw:] > sig['long_ma'][smw:], 1.0, 0.0)
# +1 marks a buy (golden cross), -1 a sell (death cross)
sig['pos'] = sig['signal'].diff()
sig
Date | short_ma | long_ma | signal | pos
---|---|---|---|---
2000-01-03 | 111.940000 | 111.940000 | 0.0 | NaN |
2000-01-04 | 107.220000 | 107.220000 | 0.0 | 0.0 |
2000-01-05 | 106.146667 | 106.146667 | 0.0 | 0.0 |
2000-01-06 | 103.360000 | 103.360000 | 0.0 | 0.0 |
2000-01-07 | 102.588000 | 102.588000 | 0.0 | 0.0 |
2000-01-10 | 101.781667 | 101.781667 | 0.0 | 0.0 |
2000-01-11 | 100.491429 | 100.491429 | 0.0 | 0.0 |
2000-01-12 | 98.828750 | 98.828750 | 0.0 | 0.0 |
2000-01-13 | 98.597778 | 98.597778 | 0.0 | 0.0 |
2000-01-14 | 98.782000 | 98.782000 | 0.0 | 0.0 |
2000-01-18 | 99.250909 | 99.250909 | 0.0 | 0.0 |
2000-01-19 | 99.860000 | 99.860000 | 0.0 | 0.0 |
2000-01-20 | 100.909231 | 100.909231 | 0.0 | 0.0 |
2000-01-21 | 101.652143 | 101.652143 | 0.0 | 0.0 |
2000-01-24 | 101.958667 | 101.958667 | 0.0 | 0.0 |
2000-01-25 | 102.601875 | 102.601875 | 0.0 | 0.0 |
2000-01-26 | 103.048235 | 103.048235 | 0.0 | 0.0 |
2000-01-27 | 103.434444 | 103.434444 | 0.0 | 0.0 |
2000-01-28 | 103.338947 | 103.338947 | 0.0 | 0.0 |
2000-01-31 | 103.359500 | 103.359500 | 0.0 | 0.0 |
2000-02-01 | 103.211429 | 103.211429 | 0.0 | 0.0 |
2000-02-02 | 103.011364 | 103.011364 | 0.0 | 0.0 |
2000-02-03 | 103.024348 | 103.024348 | 0.0 | 0.0 |
2000-02-04 | 103.231667 | 103.231667 | 0.0 | 0.0 |
2000-02-07 | 103.664800 | 103.664800 | 0.0 | 0.0 |
2000-02-08 | 104.095769 | 104.095769 | 0.0 | 0.0 |
2000-02-09 | 104.411481 | 104.411481 | 0.0 | 0.0 |
2000-02-10 | 104.736071 | 104.736071 | 0.0 | 0.0 |
2000-02-11 | 104.874483 | 104.874483 | 0.0 | 0.0 |
2000-02-14 | 105.239000 | 105.239000 | 0.0 | 0.0 |
... | ... | ... | ... | ... |
2017-11-16 | 160.876395 | 157.044258 | 1.0 | 0.0 |
2017-11-17 | 161.295395 | 157.287558 | 1.0 | 0.0 |
2017-11-20 | 161.747645 | 157.549958 | 1.0 | 0.0 |
2017-11-21 | 162.312395 | 157.823058 | 1.0 | 0.0 |
2017-11-22 | 162.857895 | 158.135858 | 1.0 | 0.0 |
2017-11-24 | 163.376395 | 158.445358 | 1.0 | 0.0 |
2017-11-27 | 163.896645 | 158.751258 | 1.0 | 0.0 |
2017-11-28 | 164.370395 | 159.041058 | 1.0 | 0.0 |
2017-11-29 | 164.762145 | 159.308558 | 1.0 | 0.0 |
2017-11-30 | 165.196395 | 159.585258 | 1.0 | 0.0 |
2017-12-01 | 165.636375 | 159.845158 | 1.0 | 0.0 |
2017-12-04 | 165.996625 | 160.087858 | 1.0 | 0.0 |
2017-12-05 | 166.355125 | 160.326858 | 1.0 | 0.0 |
2017-12-06 | 166.684375 | 160.539258 | 1.0 | 0.0 |
2017-12-07 | 167.023175 | 160.743378 | 1.0 | 0.0 |
2017-12-08 | 167.343675 | 160.941478 | 1.0 | 0.0 |
2017-12-11 | 167.760425 | 161.167378 | 1.0 | 0.0 |
2017-12-12 | 168.128175 | 161.374178 | 1.0 | 0.0 |
2017-12-13 | 168.437925 | 161.593478 | 1.0 | 0.0 |
2017-12-14 | 168.731675 | 161.812978 | 1.0 | 0.0 |
2017-12-15 | 169.084425 | 162.030778 | 1.0 | 0.0 |
2017-12-18 | 169.595425 | 162.267578 | 1.0 | 0.0 |
2017-12-19 | 170.054925 | 162.478378 | 1.0 | 0.0 |
2017-12-20 | 170.509425 | 162.716278 | 1.0 | 0.0 |
2017-12-21 | 170.957175 | 162.971378 | 1.0 | 0.0 |
2017-12-22 | 171.422300 | 163.232978 | 1.0 | 0.0 |
2017-12-26 | 171.751300 | 163.438178 | 1.0 | 0.0 |
2017-12-27 | 171.940050 | 163.572778 | 1.0 | 0.0 |
2017-12-28 | 172.049050 | 163.727878 | 1.0 | 0.0 |
2017-12-29 | 172.053800 | 163.856278 | 1.0 | 0.0 |
4526 rows × 4 columns
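With sig in hand, the actual trade dates can be read off the pos column: +1 is a golden cross (buy), -1 a death cross (sell). A self-contained sketch on a synthetic price series — the prices and the short window sizes here are made up purely for illustration:

```python
import numpy as np
import pandas as pd

idx = pd.date_range('2000-01-03', periods=10, freq='B')
close = pd.Series([10, 10, 10, 10, 20, 20, 20, 5, 5, 5],
                  index=idx, dtype=float)

smw, lmw = 2, 4
sig = pd.DataFrame(index=idx)
sig['short_ma'] = close.rolling(smw, min_periods=1).mean()
sig['long_ma'] = close.rolling(lmw, min_periods=1).mean()
sig['signal'] = np.where(sig['short_ma'] > sig['long_ma'], 1.0, 0.0)
sig['pos'] = sig['signal'].diff()

buys = sig.index[sig['pos'] == 1.0]    # short MA crosses above long MA
sells = sig.index[sig['pos'] == -1.0]  # short MA crosses below long MA
print(len(buys), len(sells))  # 1 1
```

The jump to 20 triggers one buy, the drop to 5 one sell; on the real AAPL series the same two lines list every trade the backtest below will execute.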
The closing price, the two moving averages and the buy/sell markers are plotted with the helper function:

ylabel, col, cols_ma = 'price', 'close', ['short_ma', 'long_ma']
afa.plot_function(apple, sig, ylabel, col, cols_ma)
Steps:

* create a DataFrame pos holding the positions: 100 shares of AAPL whenever the signal is 1;
* build a portfolio DataFrame ptf valuing the positions at the adjusted close;
* add the columns hds (holdings), cash, tot (total value) and r (daily returns) to the portfolio.

initial_capital, N = 2000000.0, 200
pos = pd.DataFrame(index=sig.index)
pos['AAPL'] = 100*sig['signal']  # hold 100 shares while the signal is on
ptf = pos.multiply(apple['adj_close'], axis=0)
pos_diff = pos.diff()            # shares traded each day
ptf['hds'] = (pos.multiply(apple['adj_close'], axis=0)).sum(axis=1)
ptf['cash'] = initial_capital - (pos_diff.multiply(apple['adj_close'], axis=0)).sum(axis=1).cumsum()
ptf['tot'] = ptf['cash'] + ptf['hds']   # total portfolio value
ptf['r'] = ptf['tot'].pct_change()      # daily returns
ptf.head()
Date | AAPL | hds | cash | tot | r
---|---|---|---|---|---
2000-01-03 | 0.0 | 0.0 | 2000000.0 | 2000000.0 | NaN |
2000-01-04 | 0.0 | 0.0 | 2000000.0 | 2000000.0 | 0.0 |
2000-01-05 | 0.0 | 0.0 | 2000000.0 | 2000000.0 | 0.0 |
2000-01-06 | 0.0 | 0.0 | 2000000.0 | 2000000.0 | 0.0 |
2000-01-07 | 0.0 | 0.0 | 2000000.0 | 2000000.0 | 0.0 |
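A quick consistency check on the r column: compounding (1 + r) from the starting value must reproduce the tot series. A sketch on a toy portfolio series (the four values are invented for illustration):

```python
import numpy as np
import pandas as pd

# toy total-portfolio-value series, like column 'tot' above
tot = pd.Series([2000000.0, 2010000.0, 1990000.0, 2050000.0])
r = tot.pct_change()

# compounding the returns from the initial value recovers the series
recovered = tot.iloc[0] * (1.0 + r.fillna(0.0)).cumprod()
print(bool(np.allclose(recovered, tot)))  # True
```

The same identity holds for the real ptf frame, which makes it a cheap sanity check after any change to the backtest bookkeeping.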
The total portfolio value, with the trades marked:

df, ylabel, col = ptf, 'portfolio value', 'tot'
afa.plot_function_2(df, sig, ylabel, col)
returns = ptf['r']
# annualized over 252 trading days, with the risk-free rate taken as zero
sharpe_ratio = np.sqrt(252)*(returns.mean() / returns.std())
print('sharpe_ratio is:', round(sharpe_ratio, 3))
sharpe_ratio is: 0.543
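The figure above assumes a zero risk-free rate. With a nonzero annual rate rf (an assumption on my part; the computation above uses rf = 0), the excess-return variant would look like this:

```python
import numpy as np
import pandas as pd

def annualized_sharpe(returns, rf=0.0, periods=252):
    """Annualized Sharpe ratio of a series of periodic returns.
    rf is the annual risk-free rate, de-annualized per period."""
    excess = returns - rf / periods
    return np.sqrt(periods) * excess.mean() / excess.std()

# with rf = 0 this reduces exactly to the formula used above
r = pd.Series([0.01, -0.005, 0.002, 0.007])  # toy daily returns
print(round(annualized_sharpe(r), 3))
```

Subtracting a constant leaves the standard deviation untouched, so a positive rf only shrinks the numerator.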
window = 252  # trailing one-year window
rolling_max = apple['adj_close'].rolling(window, min_periods=1).max()
# drawdown: percentage drop from the trailing maximum
daily_drawdown = (apple['adj_close']/rolling_max - 1.0)
max_daily_drawdown = daily_drawdown.rolling(window, min_periods=1).min()
daily_drawdown.plot()
max_daily_drawdown.plot()
plt.show()
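The single worst drawdown over the whole sample is just the minimum of the daily series. A self-contained sketch on a four-point toy series (the prices are invented; the real computation uses the apple['adj_close'] column as above):

```python
import pandas as pd

# toy adjusted-close series: peak at 120, trough at 90
adj_close = pd.Series([100.0, 120.0, 90.0, 110.0])

rolling_max = adj_close.rolling(window=4, min_periods=1).max()
daily_drawdown = adj_close / rolling_max - 1.0

# worst peak-to-trough loss: (90 - 120) / 120 = -25%
max_dd = daily_drawdown.min()
print(round(max_dd, 4))  # -0.25
```

Reporting this one scalar alongside the Sharpe ratio gives a quick risk/return summary of the backtest.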