Friday, June 4, 2021

Price increase from October 2021

In February 2020 we announced a price increase, which was later postponed due to the COVID-19 pandemic.

This price increase will now come into effect on October 1st, 2021.

Prices go up by 5.4% on average: the basic PTS and PTON floating license prices increase by USD 100 each, and the remaining prices are adjusted accordingly, following our usual rule that a server license costs 4 times as much as a floating license, and that annual maintenance for each part is 25% of that part's base price.
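Under these rules the whole price list follows from the floating license price alone. A toy sketch (the prices below are illustrative placeholders, not our actual price list):

```python
def price_list(floating):
    """Derive a price list from the floating license price using the stated
    rules: a server license costs 4x the floating license, and annual
    maintenance for each part is 25% of that part's base price."""
    server = 4 * floating
    return {
        "floating": floating,
        "server": server,
        "floating_maintenance": 0.25 * floating,
        "server_maintenance": 0.25 * server,
    }

# A USD 100 increase on the floating license propagates to the rest:
old = price_list(1000)   # hypothetical pre-October floating price
new = price_list(1100)
print(new["server"] - old["server"])   # server licenses go up by USD 400
```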

New prices can be found at the top of

The current prices have remained unchanged since 2015.

Tuesday, April 20, 2021

Data-driven distributionally robust optimization with MOSEK

We posted a video in which our very own Utkarsh Detha discusses the award-winning paper "Data-driven distributionally robust optimization using the Wasserstein metric: performance guarantees and tractable reformulations" with its authors, Assistant Prof. Peyman Mohajerin Esfahani and Prof. Daniel Kuhn.

We also show how to quickly and easily implement such an approach using our Fusion API for Python. This video focuses on the key new feature: parameters in Fusion. Parametric Fusion allows MOSEK to rapidly re-solve a model, and combined with the warm-start capability of the simplex optimizer, it becomes a powerful tool in every optimizer's garage.

See: the video, our notebook on the same topic, the research paper.

Monday, December 14, 2020

MOSEK in the browser and OptServer

Although both installing MOSEK and obtaining a trial license are very easy, it is now possible to get to know MOSEK directly in the browser with one click and no setup at all.


With a Google account you can run MOSEK in Google Colab notebooks. Click below to open the sample MOSEK Colab notebook in your user space and run it.

(You can also just download the notebook and run it in Jupyter or your favorite environment, but that's more than one click). Since this is a complete MOSEK installation in Python, the full Optimizer and Fusion APIs are available.

JavaScript Fusion

The Fusion API is experimentally available in JavaScript. Just click below to open the editor in the browser and start coding and solving optimization problems in our Fusion API.

Almost the complete Fusion API is exposed, and you can run a number of ready-made examples.

How does it work?

The key to this simplicity is that the actual optimizations are performed on our freely available demo Optimization Server (OptServer). On that page you can also find instructions for invoking remote optimization from all other MOSEK interfaces; that is how the Python notebook calls the OptServer. The JavaScript package runs the Fusion modeling layer in the browser and sends a JSON task object for remote optimization.

Our online demo server has problem size limits. If you like the concept of remote optimization, it is almost as easy to set up your own OptServer in a Docker container and use that to optimize.

Monday, November 16, 2020

Apple Silicon M1 plans

In response to the recent questions about MOSEK support for the new Apple Silicon M1 ARM64 architecture:

We plan to support the new Apple chips in the next major release, MOSEK 10. The release date is not yet set. Until we have access to the new platform we cannot make any guarantees regarding MOSEK's performance.

Please contact us if you have questions or plans related to MOSEK on the new Mac.

The MOSEK team

Monday, October 19, 2020

Fusion and the art of harvesting potatoes

Week 42 in Denmark was "kartoffelferie" - the fall school vacation originally designed to allow children to help with the potato harvest on the farm. The vacation still exists even though kids no longer collect potatoes, except for...

... Lærke. Lærke has her own little potato field arranged in an $n\times n$ grid. For some squares of the grid she knows the exact number of potatoes that can be harvested from that square, between $0$ and $3$. She wants to design a round trip around her little field so that each of those squares will be passed by just the right number of times to pick the potatoes, one on each pass. Other squares can be passed by any number of times. A picture with a sample layout and its solution is worth a thousand words:

Now Lærke notices that she can easily write a mixed-integer model of her problem. She will associate a binary variable with each edge of the grid, to indicate if it appears in the tour, and then needs to ensure that:

  • (1) every vertex of the grid meets either 0 or 2 of the tour edges,
  • (2) every square with $i$ potatoes has exactly $i$ tour edges on its boundary.
That will not necessarily give Lærke one closed tour, but possibly a disjoint union of cycles. A subtour elimination scheme similar to the one used for TSP could be built on top to get a single tour. Luckily, Lærke is not so worried about that - her basket is quite small so she is happy to make a few trips anyway. What she is much more interested in is that each of the constraints (1) and (2) can be implemented basically as a one-liner in Fusion by appropriate stacking and slicing and that she can see it all and try it out live here:

Finally, as much as Lærke is just our imaginary MIP user in the food industry, the puzzle actually exists and is known as Slitherlink.

Bon appétit!

(images by GLmathgrant, under CC BY-SA 3.0, from Wikipedia)

Friday, October 9, 2020

CVXPY 1.1.6

The latest release, 1.1.6, of CVXPY includes a completely rewritten MOSEK interface. In particular, it reformulates conic and linear problems differently than before when passing them to the solver, so you may experience different behavior, especially in numerically challenging cases.

Here is roughly what you should expect. If you are modeling:

  • a mixed-integer problem: no change.
  • a sparse LMI: the new version should be much more efficient, both in terms of CVXPY modeling and solution time (the problem is dualized before calling MOSEK).
  • all other linear and conic problems: the problem will be dualized before calling the solver. Ideally you should see no difference or perhaps an improvement, but it can go either way, especially if the dualized version is more challenging for any reason. In many cases MOSEK will internally choose the correct form to solve. In the remaining cases it may be necessary to explicitly force the solver to dualize.

    If you were already tuning MOSEK parameters, you should reevaluate your settings. In particular, if you were forcing the primal/dual form with iparam.intpnt_solve_form, you should most likely switch to the opposite value.

    Finally, if you are studying the solver log output, keep in mind that the solver is operating on the dual of your CVXPY formulation (DUAL_INFEASIBLE means your problem is primal infeasible, minimization becomes maximization, etc.).

See also a note in CVXPY docs.

Monday, October 5, 2020

Sharpe ratio - derivation and Fusion model

We are frequently asked about the Sharpe ratio, its formulation in the conic framework, and its implementation in Fusion. This involves a class of problems with an objective of the type $$\mathrm{maximize}_x\quad \frac{r^Tx-r_f}{\|Fx\|_2}$$ i.e. an affine function over a 2-norm, where $r^Tx-r_f>0$, $x\in\mathbb{R}^n$. In practical portfolio optimization $r$ would be the vector of expected returns, $r_f$ is the risk-free return rate, $x$ is the vector of asset allocations and $\|Fx\|_2 = \sqrt{x^T\Sigma x}$ is the risk associated with the covariance matrix $\Sigma$ (formulating the risk term as a 2-norm is standard and we don't go into details, see here or here). Typically there will be various constraints on $x$, for example $$\mathbb{1}^Tx=1,\ x\geq 0$$ would correspond to a fully invested portfolio with no short-selling.

Let us explain step by step how one can derive a conic formulation of the problem above suitable for a solver like MOSEK. We present it in detail so that the reader can apply it almost verbatim to more complicated models of this kind. First, note that if we could fix $$r^Tx-r_f=\mathrm{const}$$ then the objective would be equivalent to minimizing $\|Fx\|_2$, which is a standard second-order cone problem. However, we don't know in advance what "const" should be. (In fact solving the problem for all values of "const" corresponds to computing the efficient frontier.) Therefore, for reasons which will become clear in a moment, we denote the "const" value by $1/z$, where $z\geq0$ is a new scalar variable, i.e. $$r^Tx-r_f=1/z$$ that is $$zr^Tx-r_fz=1.$$ Denoting $y=zx$ ($y$ is now a new vector variable) the last equation becomes $$r^Ty-r_fz=1.$$ Now $x=y/z$ and the objective function becomes $$\frac{r^Tx-r_f}{\|Fx\|_2} = \frac{1/z}{\|F\frac{y}{z}\|_2} = \frac{1}{\|Fy\|_2}$$ hence we can write the original problem as $$\begin{array}{rl}\mathrm{minimize}&\|Fy\|_2\\ \mathrm{s.t.} & r^Ty-r_fz=1,\\ & z\geq 0.\end{array}$$ Note that the new problem involves variables $y$ and $z$, but $x$ has been eliminated. Any additional constraints must also be reformulated by substituting $x=y/z$, for example $\mathbb{1}^Tx=1,\ x\geq 0$ becomes $$\mathbb{1}^Ty=z,\ y\geq 0$$ and in fact any other linear constraint $Ax=b$ becomes $$Ay=bz.$$ A solution $(y,z)$ to the reformulation gives a solution $x=y/z$ to the original problem (assuming that the problem has a feasible point with $r^Tx-r_f>0$, so that the reformulation has a solution with $z>0$).

Certain other types of constraints can also be carried through the reformulation. For example, since $x$ and $y$ have the same sparsity pattern (recall $z>0$), a cardinality constraint on $x$ (at most $k$ entries of $x$ are nonzero) can be imposed on $y$ using the usual mixed-integer model. Similarly, a quadratic bound of the form $\|Hx\|_2\leq 1$ becomes $\|Hy\|_2\leq z$ after the substitution $x=y/z$.

A sample Fusion implementation can be found here: