The following problem was posted on the Julia Discourse:

So I have three magnetic resonance (MR) images: $X, Y, Z$. They come from three different pulse sequences which interact with the NMR relaxation parameters in known ways. That is, given the underlying NMR parameters (specifically $T_1$, $T_2$, and $PD$) and the scanner parameters $p^X, p^Y, p^Z$ corresponding to each image, we have equations that model the intensity observed at a voxel of the acquired image [$(i, j, k)$, e.g., $X_{i,j,k}$, since we are in 3D].

However, we don’t know the $T_1$, $T_2$ and $PD$ parameters; we only have $X, Y, Z$, and we can separately solve for each $p^{(\cdot)}$. If you haven’t followed the explanation so far, the problem simply reduces to the following system of equations, in which I am trying to solve for $T_1$, $T_2$, and $PD$, where everything else is given.

$$ \begin{align} \log(X_{i,j,k}) &= \log(PD) + p_0^X + p_1^X T_1 + p_2^X T_1^2 \\ \log(Y_{i,j,k}) &= \log(PD) + p_0^Y + p_1^Y T_1 + p_2^Y / T_2 \\ \log(Z_{i,j,k}) &= \log(PD) + p_0^Z + p_1^Z T_1 + p_2^Z / T_2 \end{align} $$

In previous iterations of this project, the person who built the code used non-linear least squares (in MATLAB, FWIW) to solve for the values of $T_1$, $T_2$ and $PD$. However, that person has since left our lab, and with them went the knowledge of why that approach was chosen.

Previously I was using Python’s SciPy implementation of least squares to solve the problem at every voxel; however, even with parallelization this takes hours.

Since these are polynomial equations, we can use homotopy continuation to solve the problem. Let's set up the system first:

In [13]:

```
using HomotopyContinuation, DynamicPolynomials
# here x, y, z stand for the log-intensities log(X_ijk), log(Y_ijk), log(Z_ijk)
# and PD for log(PD)
@polyvar x y z PD p[1:3,1:3] T[1:2]
f1 = PD + p[1,1] + p[1,2]*T[1] + p[1,3] * T[1]^2 - x
# note that we eliminate the denominator in equations 2 and 3
f2 = (PD + p[2,1] + p[2,2]*T[1])*T[2] + p[2,3] - y * T[2]
f3 = (PD + p[3,1] + p[3,2]*T[1])*T[2] + p[3,3] - z * T[2]
f = [f1, f2, f3]
```

Out[13]:
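Clearing the $1/T_2$ denominator just multiplies equations 2 and 3 through by $T_2$, so the solution set is unchanged wherever $T_2 \neq 0$. A quick sanity check in plain Julia (with arbitrary stand-in values, not real data):

```
# the cleared form of equation 2 is exactly T2 times the original residual
PD, p0, p1, p2, T1, T2, y = randn(7)
residual = PD + p0 + p1*T1 + p2/T2 - y        # equation 2 in log form
cleared  = (PD + p0 + p1*T1)*T2 + p2 - y*T2   # denominator eliminated, as in f2
@assert isapprox(cleared, T2 * residual; atol=1e-10)
```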

Since we need to solve the same system for many different parameter values, it makes sense to first compute the generic number of solutions of this polynomial system and then use a parameter homotopy.
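Concretely, a parameter homotopy connects the generic instance to a specific one along a straight line in parameter space (standard notation, not from the original post):

$$ H(x, t) = F\bigl(x;\; t\, q + (1-t)\, p\bigr), \qquad t \in [0, 1], $$

where $q$ denotes the generic start parameters and $p$ the specific target parameters. The solutions of the generic instance at $t = 1$ are numerically tracked to $t = 0$; since the generic instance already has the generic number of solutions, no superfluous paths are tracked.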

We need some generic parameter values to generate a generic instance:

In [50]:

```
params = [vec(p); x; y; z]
generic_params = randn(ComplexF64, length(params)) # random *complex* values ensure genericity
f_generic = subs.(f, Ref(params => generic_params))
```

Out[50]:

In [53]:

```
generic_result = solve(f_generic)
```

Out[53]:

We see that the system has only 2 solutions, instead of the maximal possible 8 (the Bézout bound, i.e. the product of the degrees $2 \cdot 2 \cdot 2$).

In [54]:

```
generic_solutions = solutions(generic_result)
```

Out[54]:

Now let's prepare to solve our actual problems:

In [61]:

```
# Construct a path tracker to track the two generic solutions
# to our specific instances
tracker = pathtracker(f, generic_solutions;
    parameters=params,
    # we just use some dummy parameters for now
    targetparameters=generic_params,
    startparameters=generic_params,
    affine=true)
```

Out[61]:

For lack of real-world data, let's again sample some random parameters (real this time):

In [56]:

```
specific_params = randn(length(params));
```

Note that the parameter order is:

In [57]:

```
params
```

Out[57]:

In [78]:

```
# Set up the new parameters
set_parameters!(tracker.homotopy, generic_params, specific_params);
# track to the two solutions
r1 = track(tracker, generic_solutions[1])
r2 = track(tracker, generic_solutions[2])
```

Out[78]:

In [88]:

```
# Let's check that the tracking was successful for both paths
r1.returncode == PathTrackerStatus.success && r2.returncode == PathTrackerStatus.success
```

Out[88]:

And we get our two specific solutions:

In [86]:

```
r1.x
```

Out[86]:

In [87]:

```
r2.x
```

Out[87]:

These are complex vectors, but we are only interested in real solutions, so we check that the imaginary parts vanish up to numerical error:

In [93]:

```
if maximum(abs ∘ imag, r1.x) < 1e-8
    println("First solution: ", real.(r1.x))
end
if maximum(abs ∘ imag, r2.x) < 1e-8
    println("Second solution: ", real.(r2.x))
end
```
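This filtering step is easy to factor into a small helper. A sketch in plain Julia (`real_solutions` is a hypothetical helper name here, not an API from the post), demonstrated on toy vectors standing in for `r1.x` and `r2.x`:

```
# keep only the (numerically) real vectors from a list of complex solution vectors
function real_solutions(vs; tol=1e-8)
    [real.(v) for v in vs if maximum(abs ∘ imag, v) < tol]
end

v1 = [1.0 + 1e-12im, 2.0 - 1e-12im]  # numerically real
v2 = [0.5 + 0.3im, 1.0 + 0.0im]      # genuinely complex
real_solutions([v1, v2])             # → [[1.0, 2.0]]
```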