The MOSEK office, sales and support are not available
December 24-28th and 31st, January 1st.
The MOSEK team
While there are still some summer temperatures here in Copenhagen, a certain crispness in the air signals that autumn is here!
For MOSEK the arrival of autumn means the arrival of new people eligible to use MOSEK for free.
This is because MOSEK, through our academic initiative, grants free licenses for research or educational purposes. By following the link you can see if you are eligible and request an academic license.
The academic license gives access to the full functionality of MOSEK.
Whether you are a new student or a seasoned academic, why not take this opportunity to try MOSEK!
MOSEK is now available on AWS marketplace as an AMI (Amazon Machine Image). What that means is that you can initiate an instance with MOSEK installed.
Well actually, you have to install MOSEK on the machine, but there is a script on the AMI that can do that for you!
What is actually pre-installed on the AMI is a MOSEK license. That means you can run MOSEK without having to worry about license files and MAC addresses.
Currently we are starting off with a limited number of supported instances, but we look to expand that offering in the near future.
Since we are still in an experimental mode, we are happy to receive any feedback you might have, whether about which instances you would like us to support or comments on the documentation and user experience; either way, we would like to hear from you!
As always you can reach us at support@mosek.com.
A few years ago we introduced an official Rust API for MOSEK. The API extended our optimizer API to the Rust language. The optimizer API is designed to be a thin interface to the native C optimizer API.
However, in addition to the optimizer API, MOSEK also has the Fusion API. The Fusion API is specifically designed to build conic optimization models in a simple and expressive manner. With its focus on model building, the Fusion API acts as a complement to the optimizer API. That complement has now been extended to Rust.
However, due to some language-specific attributes of Rust, in combination with how the Fusion API is constructed, there is no straightforward way to extend the Fusion API to Rust. Hence, we would like to introduce mosekcomodel!
The Rust crate mosekcomodel is a package for formulating and solving convex conic optimization models. While it looks somewhat like the MOSEK Fusion API (for Python, Java, .NET and C++), it is a
fully Rust-native package exploiting Rust's type system and zero-cost abstractions to make it simpler and faster to write correct models.
The crate provides a model-oriented interface for defining conic models and a
library of functionality for formulating affine expressions.
As highlighted in a recent LinkedIn post by Elmor Peterson, Geometric Programming (GP) is an old subfield of optimization with applications in integrated circuit design, aircraft design and control theory, amongst others.
Although GP models themselves are not convex, they can always be converted into a convex model.
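As a rough illustration of that conversion, here is a minimal sketch of the standard log-variable substitution on a made-up toy GP (this is not the MOSEK toolbox itself): minimizing $1/(xy)$ subject to $x + y \leq 1$ becomes convex in $u = \log x$, $v = \log y$.

```python
import numpy as np
from scipy.optimize import minimize

# Toy geometric program (hypothetical example, not from the toolbox):
#   minimize 1/(x*y)  subject to  x + y <= 1,  x, y > 0.
# With x = exp(u), y = exp(v) this becomes the convex problem
#   minimize exp(-u - v)  subject to  log(exp(u) + exp(v)) <= 0.

def objective(z):
    u, v = z
    return np.exp(-u - v)

def constraint(z):
    # SLSQP expects fun(z) >= 0, so return -logsumexp(u, v)
    u, v = z
    return -np.logaddexp(u, v)

res = minimize(objective, x0=[-1.0, -1.0],
               constraints=[{"type": "ineq", "fun": constraint}],
               method="SLSQP")
x, y = np.exp(res.x)
print(x, y, res.fun)  # the optimum is at x = y = 1/2 with objective value 4
```

The toolbox mentioned below automates transformations of this kind for general posynomials.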
To facilitate GP modeling we have made a GP toolbox that makes these transformations for you. To learn more about GP and play around with the toolbox check out the Marimo notebook.
Credit for the notebook goes to our student worker Izgi Tulunay.
The MOSEK office received our copy of Dany Cajas's book 'Advanced Portfolio Optimization', fresh off the press, today!
First impressions are really positive :)
Looks like it should be a great resource for many MOSEK users.
On that note, we recently added a Books page to our website, where we link to the book. There you can also find our own 'MOSEK Modeling Cookbook' and 'MOSEK Portfolio Optimization Cookbook'.
Feel free to reach out to support@mosek.com if you have suggestions on books we should add.
The new prices will come into effect on September 1st, 2025.
The price of the basic PTS and PTON floating licenses increases by 100 USD each. Our other prices follow accordingly, with NODE licenses costing 4 times the price of their floating-license counterparts and annual maintenance costing 25% of the base price of the part.
This equates to a price increase of 4.9% on average.
The new prices can be found at the top of our commercial pricing page on our website.
Support and sales are closed during Easter from Thursday 17th until Monday 21st of April, both days inclusive.
Happy $\pi$ Day! If other methods fail, you can always compute $\pi$ with the MOSEK semidefinite optimizer. We leave the details as an interesting exercise for the curious reader. Some hints are hidden in our Modeling Cookbook.
Due to popular demand we present the full modern installation process of CVX+MOSEK. It works the same way on all platforms supported by MOSEK.
If you experience issues with CVX+MOSEK please reinstall from scratch following these instructions. If you already did that, and there are still issues then please contact us with your platform, MOSEK version, license type, and an explanation of which step failed including full log/error messages.
MOSEK support is unable to help with old, broken, manually altered or otherwise non-standard CVX installations that didn't follow this process. In particular, please don't use the older 2020 CVX version, which comes bundled with the now quite outdated MOSEK 9.1.
Step 1. Installing CVX
Solving a mixed integer programming (MIP) problem can be extremely time-consuming using the so-called branch and bound algorithm. Therefore, a MIP solver like MOSEK incorporates a lot of algorithmic improvements to reduce the solution time. Sometimes those improvements are called tricks.
To evaluate whether some trick benefits the MIP solver, the solver is run with and without the trick on a benchmark set of test problems; if the benchmark results indicate that the trick helps, it is included in the solver.
Clearly, if all the test problems are solved on one specific computer, the timing results are comparable. However, to draw robust conclusions, a lot of carefully selected test problems must be employed. This has the unfortunate consequence that evaluating a new trick is very time-consuming. The benchmark problems can of course be solved in parallel, but solving multiple problems in parallel on one computer will not produce reliable timing results that can be compared. The only way of getting comparable timing results quickly is to solve many problems in parallel on a cluster of identical computers.
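For readers curious how such benchmark runs are typically summarized, a common statistic in the solver-benchmarking field (a standard practice, not necessarily MOSEK's internal tooling) is the shifted geometric mean of solve times, which damps the influence of very easy instances:

```python
import math

def shifted_geomean(times, shift=10.0):
    """Shifted geometric mean of solve times (seconds); a standard
    summary statistic in solver benchmarking."""
    n = len(times)
    return math.exp(sum(math.log(t + shift) for t in times) / n) - shift

# Hypothetical solve times on the same benchmark set, with and without a trick.
baseline   = [12.0, 340.0, 5.5, 78.0]
with_trick = [10.5, 290.0, 5.6, 60.0]

speedup = shifted_geomean(baseline) / shifted_geomean(with_trick)
print(f"speedup factor: {speedup:.3f}")
```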
That is why we at MOSEK recently invested 55K+ USD in a compute cluster made up of identical computers for the MIP development team. The cluster consists of 4 boxes each containing 8 computational nodes i.e. it provides 32 identical computers.
Hopefully, you won't have to wait too long to see the benefits in MOSEK arising from faster testing of potential new improvements in the MIP solver!
In the meantime, you can enjoy this photo of our cluster working away :)
Let us begin by answering the question in the title: no. This post presents the arguments behind that answer.
In a recent Substack post, Ben Recht mentioned that MOSEK, and optimization software in general, has many algorithmic parameters that can be tweaked. Clearly, it may be overwhelming for the average user of optimization software to figure out what the parameters do and how to change them.
Note that the types of parameters that are referred to are those that affect the algorithms of the optimizers and not the parameters that affect things like the amount of log output.
However, in the case of MOSEK a good rule of thumb is to tweak the parameters only if there is a very good reason to do so. The main reasons for this are:
In 2018, we challenged our community to find convex functions that could not be reformulated in terms of the cones supported in MOSEK. A long-standing contender for this challenge was
$$ \frac{1}{x^4 + x^2}\quad \text{ for } x > 0, \label{eq:challenge}$$
but as of today, we are finally able to present a conic reformulation of this function guided by past experiences. In this blogpost we take a look at the insights that enabled this.
Before diving into the details, however, we would like to clarify that conic optimization is able to encompass any convex function, including \eqref{eq:challenge}, via perspective reformulation. The problem, however, is that the resulting cones have widely different computational properties (some even being NP-hard to optimize over!). Restricting attention to the set of cones supported in MOSEK, we are effectively limiting ourselves to representations with practical large-scale performance. As suggested by the community challenge, however, it turns out to be quite hard to find convex functions without a computationally efficient conic representation.
So what makes \eqref{eq:challenge} so difficult?
Insight 1: Polynomial denominators
For a polynomial $g(x)$ with real roots $g_1 \leq \ldots \leq g_k$, we can represent
$$t \geq \frac{1}{g(x)} = \frac{1}{\alpha(x - g_1)\cdots(x - g_k)}$$
on a convex subinterval, $[g_p, g_{p+1}]$, via a single power cone
$$ t \alpha (x - g_1)\cdots(x - g_p)(g_{p+1} - x)\cdots(g_k - x) \geq 1,$$
by flipping the signs of the factors, $(x - g_i)$, so that they are all nonnegative on the subinterval of interest. This method can also represent the left and right tails, $(-\infty, g_1]$ and $[g_k, \infty)$, if convex. The issue with $g(x) = x^4 + x^2$ is that it has a complex conjugate root pair, $\pm i$, and so the closest we can get in this direction is
$$ t x^2 (x^2 + 1) \geq 1. \label{eq:step1}$$
Insight 2: Nonlinear substitution
A natural first attempt would be to substitute out the convex term, $x^2 + 1$, from \eqref{eq:step1} using
$$ t x^2 r \geq 1,\quad r \geq x^2 + 1,$$
but this fails since $r \rightarrow \infty$ satisfies both constraints without enforcing the desired relation. We thus resort to a lesser-known trick, namely to rewrite $x^2 + 1 = f(f^{-1}(x^2 + 1))$ and then substitute out $f^{-1}(x^2 + 1)$. With this in mind, it would be natural to try
$$ t x^2 r^p \geq 1,\quad r \leq (x^2 + 1)^{1/p},$$
for some suitable power $p$, but this fails because $(x^2 + 1)^{1/p}$ is not concave on all of $x > 0$ for any $p \geq 1$. Nevertheless, with a bit of persistence, we finally manage to satisfy all composition rules with
$$ t x r^4 \geq 1,\quad r \leq (x^3 + x)^{1/4}, \label{eq:step2}$$
where the latter is a convex set in need of a conic reformulation.
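To see concretely why the naive substitution at the start of this insight fails, fix $x = 1$: the true epigraph requires $t \geq 1/(x^2(x^2+1)) = 1/2$, yet letting $r$ grow admits much smaller $t$. A quick numeric check (illustrative only):

```python
# Naive substitution: t*x^2*r >= 1 and r >= x^2 + 1.
x = 1.0
true_t = 1.0 / (x**2 * (x**2 + 1))   # = 0.5, the correct lower bound on t
r = 100.0                             # r >= x^2 + 1 = 2 holds
t = 1.0 / (x**2 * r)                  # = 0.01
assert r >= x**2 + 1                  # second constraint satisfied
assert t * x**2 * r >= 1 - 1e-12      # first constraint satisfied...
assert t < true_t                     # ...yet t is far below the true value
```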
Insight 3: Signed sum-of-quartics decomposition
The signed sum-of-squares decomposition was introduced in a previous talk on our MOSEK channel as a new reformulation tool with great applicability. In the case of $(x^3 + x)^{1/4}$, the fourth root suggests the need for a signed sum-of-quartics decomposition instead. We do not know of any algorithm to perform this decomposition, but using the method of undetermined coefficients (guessing the final form, and mapping coefficients via quantifier elimination), we obtain
$$ x^3 + x = (ax + a)^4 - (ax - a)^4,$$
where $a = \frac{1}{2^{3/4}}$. This allows us to rewrite \eqref{eq:step2} as
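This identity is easy to verify directly: since $a^4 = 2^{-3} = \tfrac{1}{8}$,

$$ (ax + a)^4 - (ax - a)^4 = a^4\left[(x+1)^4 - (x-1)^4\right] = \tfrac{1}{8}\left(8x^3 + 8x\right) = x^3 + x. $$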
$$ t x r^4 \geq 1,\quad r^4 + (ax - a)^4 \leq (ax + a)^4, \label{eq:step3}$$
where the former is a power cone and the latter is the p-norm cone of order 4, which is a well-known conic representable set (we refer to the MOSEK Cookbook).
This concludes our reformulation from the convex function \eqref{eq:challenge} to the conic representation \eqref{eq:step3}. In time, we shall see if this process extends to a wider range of polynomial denominators.
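As a sanity check (a small illustrative script, not part of the derivation), one can verify numerically that $t = 1/(x^4 + x^2)$ together with $r = (x^3 + x)^{1/4}$ satisfies both constraints of \eqref{eq:step3} with equality:

```python
# Check: with r = (x^3 + x)^(1/4) and t = 1/(x^4 + x^2),
# both t*x*r^4 >= 1 and r^4 + (a*x - a)^4 <= (a*x + a)^4 hold with equality.
a = 2 ** (-0.75)
for x in [0.3, 1.0, 2.5, 10.0]:
    r = (x**3 + x) ** 0.25
    t = 1.0 / (x**4 + x**2)
    assert abs(t * x * r**4 - 1.0) < 1e-9
    assert abs(r**4 + (a*x - a)**4 - (a*x + a)**4) < 1e-8
```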
MOSEK 11 has now been released!
You will find all the necessary information, installation instructions and documentation on the website of the new version. To try it out you must obtain a new license file, either by applying/reapplying for a trial/academic license on our website or, if you are a customer, by contacting our sales.
The main features of MOSEK 11 are
A frequently asked question by MOSEK users is whether the interior-point optimizer can be warm-started. The short answer is that no warm-start is available for the interior-point optimizer.
The reason for the lack of a warm-start capability is that no general-purpose interior-point warm-starting method is known that works well in practice.
The paper contains some limited computational evidence that an interior-point warm-start may be possible: it shows that in the best case, where good primal and dual solution guesses are available, the number of iterations is reduced by 50%. However, the paper does not consider the complications of integrating a warm-start with the presolve procedure of a commercial optimizer like MOSEK. Moreover, an interior-point method benefits a lot from presolve, which has a fairly computationally expensive setup step that is independent of any warm-start. Hence, a large reduction in the number of iterations does not translate into an equally large decrease in optimizer time, due to the large initialization cost.
Based on the discussion above, it can be concluded that how to warm-start an interior-point optimizer remains an open question. Furthermore, obtaining a 50% reduction in optimizer time by warm-starting is unlikely even if warm-starting were possible. A reduction of somewhere between 0 and 25% in optimizer time is much more likely, if a good warm-start method were known.
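The back-of-the-envelope arithmetic behind that estimate (with an assumed, purely illustrative cost split) goes as follows: if, say, half of the optimizer time is spent in presolve and setup, which a warm-start does not reduce, then halving the number of iterations cuts total time by only 25%.

```python
# Illustrative Amdahl-style estimate; the 50/50 split is an assumption.
setup_fraction = 0.5                      # time share of presolve/setup (assumed)
iter_fraction = 1.0 - setup_fraction      # time share of interior-point iterations
warm_time = setup_fraction + iter_fraction * 0.5   # 50% fewer iterations
reduction = 1.0 - warm_time
print(f"total time reduction: {reduction:.0%}")    # -> 25%
```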