The standard Gibbs energy change $\Delta G$ of a reaction was determined by measuring equilibrium constants at various temperatures. The following values were obtained:

| $T$ [K] | $\Delta G$ [kJ $\mathrm{mol}^{-1}$] |
|---|---|
| 270 | 40.3 |
| 280 | 38.2 |
| 290 | 36.1 |
| 300 | 32.2 |
| 310 | 29.1 |
| 320 | 28.0 |
| 330 | 25.3 |

The uncertainty in $T$ is negligible, and all data points carry equal weight.

- Determine the reaction entropy $\Delta S = -\mathrm{d}\Delta G/\mathrm{d}T$ by fitting the values of $\Delta G$ to a linear function of $T$.
- What is the uncertainty in $\Delta S$ at 90% confidence?
- Extrapolate $\Delta G$ to $T = 350 \mathrm{K}$ and give the uncertainty at the 90% confidence level. Compare with the 90% uncertainty for $\Delta G$ at $T = 290 \mathrm{K}$. Discuss the difference between the two uncertainties. What can you conclude?
- Import the seaborn library, use the regplot function to plot the data. Try 90% and 99% confidence intervals when plotting. Compare and conclude.

In [2]:

```
import numpy as np
import matplotlib.pyplot as plt
%matplotlib inline
from scipy import stats
import seaborn as sns

# Temperatures [K] and measured Gibbs energies [kJ/mol] from the table above
T = np.array([270, 280, 290, 300, 310, 320, 330])
DeltaG = np.array([40.3, 38.2, 36.1, 32.2, 29.1, 28.0, 25.3])
x, y = T, DeltaG  # aliases used in the fitting steps below
```
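The first two tasks can be sketched as follows. This is a minimal approach, assuming `scipy.stats.linregress` as the fitting routine (any least-squares fit works): the slope of $\Delta G$ versus $T$ is $-\Delta S$, and its 90% confidence interval follows from the slope's standard error and the Student $t$-value with $n-2$ degrees of freedom.

```python
import numpy as np
from scipy import stats

# Data from the table above
T = np.array([270, 280, 290, 300, 310, 320, 330])              # K
DeltaG = np.array([40.3, 38.2, 36.1, 32.2, 29.1, 28.0, 25.3])  # kJ/mol

# Least-squares fit DeltaG = intercept + slope * T
res = stats.linregress(T, DeltaG)
DeltaS = -res.slope                   # kJ mol^-1 K^-1, since DeltaS = -d(DeltaG)/dT

# 90% two-sided confidence interval: 0.95 quantile of t with n-2 dof
n = len(T)
t90 = stats.t.ppf(0.95, n - 2)
dDeltaS = t90 * res.stderr            # res.stderr is the standard error of the slope

print(f"DeltaS = {DeltaS:.4f} +/- {dDeltaS:.4f} kJ mol^-1 K^-1 (90% CI)")
```

With these data the fit gives $\Delta S \approx 0.26 \pm 0.03$ kJ $\mathrm{mol}^{-1}\,\mathrm{K}^{-1}$.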
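For the extrapolation task, one way (an assumption, not prescribed by the exercise) is the standard confidence interval for the mean response of a linear fit at a point $T_0$, whose half-width is $t_{90}\, s \sqrt{1/n + (T_0-\bar T)^2/S_{xx}}$ with $s$ the residual standard deviation:

```python
import numpy as np
from scipy import stats

T = np.array([270, 280, 290, 300, 310, 320, 330])              # K
DeltaG = np.array([40.3, 38.2, 36.1, 32.2, 29.1, 28.0, 25.3])  # kJ/mol

res = stats.linregress(T, DeltaG)
n = len(T)
resid = DeltaG - (res.intercept + res.slope * T)
s = np.sqrt(np.sum(resid**2) / (n - 2))        # residual standard deviation
Sxx = np.sum((T - T.mean())**2)
t90 = stats.t.ppf(0.95, n - 2)

for T0 in (290.0, 350.0):
    G0 = res.intercept + res.slope * T0
    # Half-width of the 90% CI grows with the distance of T0 from the mean T
    half = t90 * s * np.sqrt(1/n + (T0 - T.mean())**2 / Sxx)
    print(f"T = {T0:.0f} K: DeltaG = {G0:.2f} +/- {half:.2f} kJ/mol (90% CI)")
```

The $(T_0 - \bar T)^2$ term makes the band widen away from the data's center at $\bar T = 300$ K, so the extrapolated value at 350 K is considerably more uncertain than the interpolated one at 290 K.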
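For the last task, `seaborn.regplot` draws the data, the regression line, and a bootstrapped confidence band; its `ci` parameter sets the confidence level. A side-by-side comparison might look like this (the figure layout is a suggestion):

```python
import numpy as np
import matplotlib
matplotlib.use("Agg")  # non-interactive backend; not needed inside a notebook
import matplotlib.pyplot as plt
import seaborn as sns

T = np.array([270, 280, 290, 300, 310, 320, 330])              # K
DeltaG = np.array([40.3, 38.2, 36.1, 32.2, 29.1, 28.0, 25.3])  # kJ/mol

fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(10, 4), sharey=True)
sns.regplot(x=T, y=DeltaG, ci=90, ax=ax1)   # 90% confidence band
sns.regplot(x=T, y=DeltaG, ci=99, ax=ax2)   # 99% confidence band
ax1.set(title="90% CI", xlabel="T [K]", ylabel=r"$\Delta G$ [kJ mol$^{-1}$]")
ax2.set(title="99% CI", xlabel="T [K]")
plt.tight_layout()
```

The 99% band is visibly wider than the 90% band: demanding higher confidence that the band contains the true line requires covering a larger range of plausible fits.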