This project aims to examine the historical Euro-to-Dollar conversion rate through a data set of daily Euro exchange rates across multiple currencies, from the Euro's inception to the beginning of 2021. Trends across six economically impactful events will then be visually represented to see what kind of relationship exists between these events and the Euro (€) to Dollar ($) exchange rate.
import pandas as pd
exchange_rates = pd.read_csv('euro-daily-hist_1999_2020.csv')
exchange_rates.head()
Period\Unit: | [Australian dollar ] | [Bulgarian lev ] | [Brazilian real ] | [Canadian dollar ] | [Swiss franc ] | [Chinese yuan renminbi ] | [Cypriot pound ] | [Czech koruna ] | [Danish krone ] | ... | [Romanian leu ] | [Russian rouble ] | [Swedish krona ] | [Singapore dollar ] | [Slovenian tolar ] | [Slovak koruna ] | [Thai baht ] | [Turkish lira ] | [US dollar ] | [South African rand ] | |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
0 | 2021-01-08 | 1.5758 | 1.9558 | 6.5748 | 1.5543 | 1.0827 | 7.9184 | NaN | 26.163 | 7.4369 | ... | 4.8708 | 90.8000 | 10.0510 | 1.6228 | NaN | NaN | 36.8480 | 9.0146 | 1.2250 | 18.7212 |
1 | 2021-01-07 | 1.5836 | 1.9558 | 6.5172 | 1.5601 | 1.0833 | 7.9392 | NaN | 26.147 | 7.4392 | ... | 4.8712 | 91.2000 | 10.0575 | 1.6253 | NaN | NaN | 36.8590 | 8.9987 | 1.2276 | 18.7919 |
2 | 2021-01-06 | 1.5824 | 1.9558 | 6.5119 | 1.5640 | 1.0821 | 7.9653 | NaN | 26.145 | 7.4393 | ... | 4.8720 | 90.8175 | 10.0653 | 1.6246 | NaN | NaN | 36.9210 | 9.0554 | 1.2338 | 18.5123 |
3 | 2021-01-05 | 1.5927 | 1.9558 | 6.5517 | 1.5651 | 1.0803 | 7.9315 | NaN | 26.227 | 7.4387 | ... | 4.8721 | 91.6715 | 10.0570 | 1.6180 | NaN | NaN | 36.7760 | 9.0694 | 1.2271 | 18.4194 |
4 | 2021-01-04 | 1.5928 | 1.9558 | 6.3241 | 1.5621 | 1.0811 | 7.9484 | NaN | 26.141 | 7.4379 | ... | 4.8713 | 90.3420 | 10.0895 | 1.6198 | NaN | NaN | 36.7280 | 9.0579 | 1.2296 | 17.9214 |
5 rows × 41 columns
exchange_rates.tail()
Period\Unit: | [Australian dollar ] | [Bulgarian lev ] | [Brazilian real ] | [Canadian dollar ] | [Swiss franc ] | [Chinese yuan renminbi ] | [Cypriot pound ] | [Czech koruna ] | [Danish krone ] | ... | [Romanian leu ] | [Russian rouble ] | [Swedish krona ] | [Singapore dollar ] | [Slovenian tolar ] | [Slovak koruna ] | [Thai baht ] | [Turkish lira ] | [US dollar ] | [South African rand ] | |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
5694 | 1999-01-08 | 1.8406 | NaN | NaN | 1.7643 | 1.6138 | NaN | 0.58187 | 34.938 | 7.4433 | ... | 1.3143 | 27.2075 | 9.1650 | 1.9537 | 188.8400 | 42.560 | 42.5590 | 0.3718 | 1.1659 | 6.7855 |
5695 | 1999-01-07 | 1.8474 | NaN | NaN | 1.7602 | 1.6165 | NaN | 0.58187 | 34.886 | 7.4431 | ... | 1.3092 | 26.9876 | 9.1800 | 1.9436 | 188.8000 | 42.765 | 42.1678 | 0.3701 | 1.1632 | 6.8283 |
5696 | 1999-01-06 | 1.8820 | NaN | NaN | 1.7711 | 1.6116 | NaN | 0.58200 | 34.850 | 7.4452 | ... | 1.3168 | 27.4315 | 9.3050 | 1.9699 | 188.7000 | 42.778 | 42.6949 | 0.3722 | 1.1743 | 6.7307 |
5697 | 1999-01-05 | 1.8944 | NaN | NaN | 1.7965 | 1.6123 | NaN | 0.58230 | 34.917 | 7.4495 | ... | 1.3168 | 26.5876 | 9.4025 | 1.9655 | 188.7750 | 42.848 | 42.5048 | 0.3728 | 1.1790 | 6.7975 |
5698 | 1999-01-04 | 1.9100 | NaN | NaN | 1.8004 | 1.6168 | NaN | 0.58231 | 35.107 | 7.4501 | ... | 1.3111 | 25.2875 | 9.4696 | 1.9554 | 189.0450 | 42.991 | 42.6799 | 0.3723 | 1.1789 | 6.9358 |
5 rows × 41 columns
Some columns contain NaN values. Currencies like the Cypriot pound, Slovenian tolar, and Slovak koruna were all replaced by the euro by the end of the first decade of the new millennium.
exchange_rates.info()
<class 'pandas.core.frame.DataFrame'> RangeIndex: 5699 entries, 0 to 5698 Data columns (total 41 columns): # Column Non-Null Count Dtype --- ------ -------------- ----- 0 Period\Unit: 5699 non-null object 1 [Australian dollar ] 5699 non-null object 2 [Bulgarian lev ] 5297 non-null object 3 [Brazilian real ] 5431 non-null object 4 [Canadian dollar ] 5699 non-null object 5 [Swiss franc ] 5699 non-null object 6 [Chinese yuan renminbi ] 5431 non-null object 7 [Cypriot pound ] 2346 non-null object 8 [Czech koruna ] 5699 non-null object 9 [Danish krone ] 5699 non-null object 10 [Estonian kroon ] 3130 non-null object 11 [UK pound sterling ] 5699 non-null object 12 [Greek drachma ] 520 non-null object 13 [Hong Kong dollar ] 5699 non-null object 14 [Croatian kuna ] 5431 non-null object 15 [Hungarian forint ] 5699 non-null object 16 [Indonesian rupiah ] 5699 non-null object 17 [Israeli shekel ] 5431 non-null object 18 [Indian rupee ] 5431 non-null object 19 [Iceland krona ] 3292 non-null float64 20 [Japanese yen ] 5699 non-null object 21 [Korean won ] 5699 non-null object 22 [Lithuanian litas ] 4159 non-null object 23 [Latvian lats ] 3904 non-null object 24 [Maltese lira ] 2346 non-null object 25 [Mexican peso ] 5699 non-null object 26 [Malaysian ringgit ] 5699 non-null object 27 [Norwegian krone ] 5699 non-null object 28 [New Zealand dollar ] 5699 non-null object 29 [Philippine peso ] 5699 non-null object 30 [Polish zloty ] 5699 non-null object 31 [Romanian leu ] 5637 non-null float64 32 [Russian rouble ] 5699 non-null object 33 [Swedish krona ] 5699 non-null object 34 [Singapore dollar ] 5699 non-null object 35 [Slovenian tolar ] 2085 non-null object 36 [Slovak koruna ] 2608 non-null object 37 [Thai baht ] 5699 non-null object 38 [Turkish lira ] 5637 non-null float64 39 [US dollar ] 5699 non-null object 40 [South African rand ] 5699 non-null object dtypes: float64(3), object(38) memory usage: 1.8+ MB
The output shows 5,699 rows and 41 columns in total. Three of the columns are floats, while the rest are object types.
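Since most columns were read in as object dtype, they will need conversion before numeric work. As a minimal sketch (assuming, as the data later confirms, that the file uses placeholder strings like '-' for missing values), `pd.to_numeric` with `errors='coerce'` converts such a column to float, turning non-numeric entries into NaN:

```python
import pandas as pd

# Toy column mimicking the object-dtype rate columns; the '-' placeholder
# is an assumption based on values seen later in this data set.
s = pd.Series(['1.1659', '-', '1.1743'])

numeric = pd.to_numeric(s, errors='coerce')  # '-' becomes NaN
print(numeric.dtype)              # float64
print(int(numeric.isna().sum()))  # 1
```

For this project the '-' rows are instead filtered out explicitly before casting, which keeps the row count honest rather than silently carrying NaNs forward.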
exchange_rates.rename(columns={'[US dollar ]': 'US_dollar',
'Period\\Unit:': 'Time'},
inplace=True)
exchange_rates['Time'] = pd.to_datetime(exchange_rates['Time'])
exchange_rates.sort_values('Time', inplace=True)
exchange_rates.reset_index(drop=True, inplace=True)
euro_to_dollar = exchange_rates[['Time', 'US_dollar']].copy()
exchange_rates["US_dollar"].value_counts()
- 62 1.2276 9 1.1215 8 1.1305 7 1.1797 6 .. 0.9007 1 1.3386 1 1.3056 1 1.0487 1 1.2405 1 Name: US_dollar, Length: 3528, dtype: int64
There are 3,528 distinct values across 5,699 rows, so many dates share the same rate. More importantly, there are 62 '-' placeholder characters marking missing values, which need to be dropped before the column can be converted to float.
euro_to_dollar = euro_to_dollar[euro_to_dollar['US_dollar'] != '-'].copy()
euro_to_dollar['US_dollar'] = euro_to_dollar['US_dollar'].astype(float)
euro_to_dollar.info()
<class 'pandas.core.frame.DataFrame'> Int64Index: 5637 entries, 0 to 5698 Data columns (total 2 columns): # Column Non-Null Count Dtype --- ------ -------------- ----- 0 Time 5637 non-null datetime64[ns] 1 US_dollar 5637 non-null float64 dtypes: datetime64[ns](1), float64(1) memory usage: 132.1 KB
import matplotlib.pyplot as plt
%matplotlib inline
plt.plot(euro_to_dollar["Time"],
euro_to_dollar["US_dollar"])
plt.show()
Above is a graph of the Euro-to-Dollar exchange rate over time. Below, we examine how a rolling mean (the average over a specified window of days up to a given day) affects the smoothness of the graph.
rolling_mean = euro_to_dollar['US_dollar'].rolling(30).mean()
plt.plot(euro_to_dollar["Time"],
rolling_mean)
plt.show()
The graph has improved in smoothness, and the 30-day rolling mean will be used for further data visualizations.
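The 30-day window is a judgment call. A quick sketch (using a synthetic random-walk series standing in for the exchange-rate data) shows the trade-off: by default `rolling(n).mean()` yields NaN for the first n-1 rows, so a larger window smooths more but drops more leading values:

```python
import numpy as np
import pandas as pd

# Synthetic daily series standing in for euro_to_dollar['US_dollar'].
rng = np.random.default_rng(0)
rates = pd.Series(1.1 + np.cumsum(rng.normal(0, 0.005, size=365)))

for window in (7, 30, 90):
    smoothed = rates.rolling(window).mean()
    # rolling(n).mean() leaves the first n-1 entries as NaN by default
    print(window, int(smoothed.isna().sum()))  # 6, 29, 89 respectively
```

Passing `min_periods=1` would fill those leading rows with partial-window averages instead, at the cost of less smoothing at the start of the series.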
The analysis is made up of seven distinct instances in time: the date the Euro originated, the September 11 World Trade Center attacks, Hurricane Katrina, the unofficial start of the housing crisis, the initial vote that would lead to the UK leaving the EU (Brexit), America's recognition of COVID-19 leading to lockdown orders as well as soaring case counts, and the most recent date in the data set.
Origin_to_Sep11 = euro_to_dollar[(euro_to_dollar['Time'] >= '1999-01-04') & (euro_to_dollar['Time'] <= '2001-09-11')]
Sep11_to_Katrina = euro_to_dollar[(euro_to_dollar['Time'] >= '2001-09-11') & (euro_to_dollar['Time'] <= '2005-08-29')]
Katrina_to_Housing_crisis = euro_to_dollar[(euro_to_dollar['Time'] >= '2005-08-29') & (euro_to_dollar['Time'] <= '2008-12-30')]
Housing_crisis_to_Brexit = euro_to_dollar[(euro_to_dollar['Time'] >= '2008-12-30') & (euro_to_dollar['Time'] <= '2016-06-23')] ## 'Japan Triple Disaster'–Earthquake,Tsunami,Radioactive Leak
Brexit_to_Covid_start = euro_to_dollar[(euro_to_dollar['Time'] >= '2016-06-23') & (euro_to_dollar['Time'] <= '2020-03-09')]
Covid_start_early_2021 = euro_to_dollar[(euro_to_dollar['Time'] >= '2020-03-09') & (euro_to_dollar['Time'] <= '2021-01-08')]
Origin_Sep11 = Origin_to_Sep11["US_dollar"].rolling(30).mean()
import matplotlib.style as style
style.use('fivethirtyeight')
import datetime as dt
from matplotlib.dates import DateFormatter
import matplotlib.dates as mdates
import datetime
fig = plt.figure(figsize=(30,20))
#1ST GRAPH
ax1 = fig.add_subplot(271)
ax1.plot(euro_to_dollar['Time'],rolling_mean,
color='#dcb56e', linewidth=2.5)
plt.xlim([datetime.date(1999, 1, 4), datetime.date(2001, 9, 11)])
ax1.xaxis.set_major_locator(mdates.MonthLocator(interval=4))
ax1.xaxis.set_major_formatter(DateFormatter("%Y-%m"))
plt.xticks(rotation = 90)
plt.ylim([.8,1.6])
ax1.text(datetime.date(1999, 4, 4), 1.7,
s = 'Invention to 911', color='#2d2de5', fontsize = 24)
#2ND GRAPH
ax2 = fig.add_subplot(272)
ax2.plot(euro_to_dollar['Time'],rolling_mean, color='#2d2de5', linewidth=2.5)
plt.xlim([datetime.date(2001, 9, 11), datetime.date(2005, 8, 29)])
ax2.xaxis.set_major_locator(mdates.MonthLocator(interval=4))
ax2.xaxis.set_major_formatter(DateFormatter("%Y-%m"))
plt.xticks(rotation = 90)
plt.ylim([.8,1.6])
ax2.text(datetime.date(2002, 1, 11), 1.7,
s = '911 to Katrina', color='#2d2de5', fontsize = 24)
#3RD GRAPH
ax3 = fig.add_subplot(273)
ax3.plot(euro_to_dollar['Time'],rolling_mean,
color='#bb00ff', linewidth=2.5)
plt.xlim([datetime.date(2005, 8, 29), datetime.date(2008, 12, 30)])
ax3.xaxis.set_major_locator(mdates.MonthLocator(interval=4))
ax3.xaxis.set_major_formatter(DateFormatter("%Y-%m"))
plt.xticks(rotation = 90)
plt.ylim([.8,1.6])
ax3.set_yticklabels([])
ax3.text(datetime.date(2005, 4, 29), 1.7,
s = 'Katrina to Housing Crisis', color='#bb00ff', fontsize = 24)
ax3.annotate('max', (dt.datetime(2008,7,15),1.599), xytext=(10, -10),
textcoords='offset points')
#4TH GRAPH
ax4 = fig.add_subplot(2,7,(4,5))
ax4.plot(euro_to_dollar['Time'],rolling_mean,
color='#ffa500', linewidth=2.5)
plt.xlim([datetime.date(2008, 12, 30), datetime.date(2016, 6, 23)])
ax4.xaxis.set_major_locator(mdates.MonthLocator(interval=4))
ax4.xaxis.set_major_formatter(DateFormatter("%Y-%m"))
plt.xticks(rotation = 90)
plt.ylim([.8,1.6])
ax4.text(datetime.date(2009, 12, 30), 1.7,
s = 'Housing Crisis to Brexit Upvote', color='#ffa500', fontsize = 24)
#5TH GRAPH
ax5 = fig.add_subplot(2,7,6)
ax5.plot(euro_to_dollar['Time'],rolling_mean, color='#9ed670', linewidth=2.5)
plt.xlim([datetime.date(2016, 6, 23), datetime.date(2020, 3, 9)])
ax5.xaxis.set_major_locator(mdates.MonthLocator(interval=4))
ax5.xaxis.set_major_formatter(DateFormatter("%Y-%m"))
plt.xticks(rotation = 90)
plt.ylim([.8,1.6])
ax5.text(datetime.date(2016, 11, 23), 1.7,
s = 'Brexit Upvote', color='#9ed670', fontsize = 24)
ax5.text(datetime.date(2016, 11, 23), 1.65,
s = 'to Coronavirus', color='#9ed670', fontsize = 24)
#6TH GRAPH
ax6 = fig.add_subplot(2,7,7)
ax6.plot(euro_to_dollar['Time'],rolling_mean,
color='#24778c', linewidth=2.5)
plt.xlim([datetime.date(2020, 3, 9), datetime.date(2021, 1, 8)])
ax6.xaxis.set_major_locator(mdates.MonthLocator(interval=1))
ax6.xaxis.set_major_formatter(DateFormatter("%Y-%m"))
plt.xticks(rotation = 90)
ax6.set_yticklabels([])
plt.ylim([.8,1.6])
ax6.text(datetime.date(2020, 2, 9), 1.7,
s = 'Coronavirus to Current', color='#24778c', fontsize = 24)
axes = [ax2,ax3,ax4,ax5,ax6]
for ax in axes:
ax.set_yticklabels(labels=[])
#1ST INTERVAL
ax7 = fig.add_subplot(212)
ax7.plot(Origin_to_Sep11['Time'],Origin_to_Sep11['US_dollar'].rolling(30).mean(),
color='#dcb56e', linewidth=10.0, solid_capstyle = 'round')
plt.xticks(rotation = 90)
plt.xlim([datetime.date(1999, 1, 4), datetime.date(2021, 4, 8)])
ax7.xaxis.set_major_locator(mdates.MonthLocator(interval=12))
ax7.xaxis.set_major_formatter(DateFormatter("%Y-%m"))
#2ND INTERVAL
ax7.plot(Sep11_to_Katrina['Time'],Sep11_to_Katrina['US_dollar'].rolling(30).mean(),
color='#2d2de5', linewidth=10.0, solid_capstyle = 'round',alpha = 0.4)
#3RD INTERVAL
ax7.plot(Katrina_to_Housing_crisis['Time'],Katrina_to_Housing_crisis['US_dollar'].rolling(30).mean(),
color='#bb00ff', linewidth=10.0, solid_capstyle = 'round',alpha = 0.4)
#4TH INTERVAL
ax7.plot(Housing_crisis_to_Brexit['Time'],Housing_crisis_to_Brexit['US_dollar'].rolling(30).mean(),
color='#ffa500', linewidth=10.0, solid_capstyle = 'round')
#5TH INTERVAL
ax7.plot(Brexit_to_Covid_start['Time'],Brexit_to_Covid_start['US_dollar'].rolling(30).mean(),
color='#9ed670', linewidth=10.0, solid_capstyle = 'round')
#6TH INTERVAL
ax7.plot(Covid_start_early_2021['Time'],Covid_start_early_2021['US_dollar'].rolling(30).mean(),
color='#24778c', linewidth=10.0, solid_capstyle = 'round',alpha = 0.4)
ax7.plot(euro_to_dollar['Time'],rolling_mean,
color='#24778c', linewidth=4.0, solid_capstyle = 'round',alpha=0.4)
ax7.grid(False)
ax7.grid(axis='y')
pos = [ datetime.date(1999, 1, 4),datetime.date(2001, 9, 11),datetime.date(2005, 8, 29),
datetime.date(2008, 12, 30),datetime.date(2016, 6, 23),datetime.date(2020, 3, 9)]
lab = [ '1999-01-04', '2001-09-11','2005-08-29', '2008-12-30','2016-06-23', '2020-03-09']
plt.xticks(pos, lab)
ax7.axvspan(*mdates.datestr2num(['4/1/2008', '9/1/2008']), color='grey', alpha=0.4)
plt.show()
Above is a figure with the top row broken up into six graphs. These graphs do not cover equal time periods; instead, they are divided by notable economic events and their corresponding dates. The graph below consolidates the graphs above and is color-coded to give a sense of where, and how much space, each interval occupies over the entire time the euro has been in existence. There are some important aspects to consider based on the graph:
point_a_index_max = Origin_to_Sep11['US_dollar'].idxmax()
point_a_conversion_max = Origin_to_Sep11['US_dollar'].max()
point_b_index_max = Sep11_to_Katrina['US_dollar'].idxmax()
point_b_conversion_max = Sep11_to_Katrina['US_dollar'].max()
point_c_index_max = Katrina_to_Housing_crisis['US_dollar'].idxmax()
point_c_conversion_max = Katrina_to_Housing_crisis['US_dollar'].max()
point_d_index_max = Housing_crisis_to_Brexit['US_dollar'].idxmax()
point_d_conversion_max = Housing_crisis_to_Brexit['US_dollar'].max()
point_e_index_max = Brexit_to_Covid_start['US_dollar'].idxmax()
point_e_conversion_max = Brexit_to_Covid_start['US_dollar'].max()
point_f_index_max = Covid_start_early_2021['US_dollar'].idxmax()
point_f_conversion_max = Covid_start_early_2021['US_dollar'].max()
index_max = [point_a_index_max, point_b_index_max, point_c_index_max,
point_d_index_max, point_e_index_max, point_f_index_max]
conversion_max = [point_a_conversion_max, point_b_conversion_max, point_c_conversion_max,
point_d_conversion_max, point_e_conversion_max, point_f_conversion_max]
era_max_conversions = {}
for index, conversion in zip(index_max, conversion_max):
era_max_conversions[euro_to_dollar['Time'][index].strftime("%Y/%m/%d")] = conversion
for key, value in era_max_conversions.items():
print(key, ':', value)
point_d_index_min = Housing_crisis_to_Brexit['US_dollar'].idxmin()
point_d_conversion_min = Housing_crisis_to_Brexit['US_dollar'].min()
Fourth_period_min = euro_to_dollar['Time'][point_d_index_min].strftime("%Y/%m/%d")
print('The fourth period minimum is', Fourth_period_min)
1999/01/05 : 1.179 2004/12/28 : 1.3633 2008/07/15 : 1.599 2009/12/03 : 1.512 2018/02/15 : 1.2493 2021/01/06 : 1.2338 The fourth period minimum is 2015/04/13
The maximum conversion rates for the Euro in each time period are given above. The last two time periods have nearly identical peak conversion rates. It is interesting to note that the Euro conversion rate peaked in value at the onset of the housing crisis. As America slipped into the Great Recession in late 2008 and 2009, the Euro conversion rate went through a lot of volatility on a downward trend, and it has now settled at a value only slightly above where it first originated. The Dollar conversion of the Euro has experienced many maxima and minima. It would be interesting to see further analysis of whether economic, or even political, events correlate with those maxima and minima, if such correlations exist.
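As a starting point for that kind of analysis, local maxima and minima can be located with a simple neighbor comparison. This is a hedged sketch on a toy series; with the real data one would pass the 30-day rolling mean of `euro_to_dollar['US_dollar']` instead, and map the resulting index positions back to dates:

```python
import pandas as pd

# Toy series standing in for the smoothed exchange rate.
rates = pd.Series([1.10, 1.15, 1.12, 1.08, 1.11, 1.09])

# A point is a local max (min) if it is strictly above (below) both neighbors.
is_max = (rates.shift(1) < rates) & (rates.shift(-1) < rates)
is_min = (rates.shift(1) > rates) & (rates.shift(-1) > rates)

print(rates[is_max].index.tolist())  # [1, 4]
print(rates[is_min].index.tolist())  # [3]
```

Each extremum's date could then be checked against a timeline of economic and political events to see whether they cluster, which is the correlation question raised above.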