Monday, November 8, 2021

Some animals are more equal than others: least-squares vs Euclidean norm

A user recently reached out to MOSEK support because the solver returned an UNKNOWN problem/solution status on their constrained least-squares problem, implemented in CVXPY. The solver output is shown in the first MOSEK log below.

The solution we offered was to minimize the Euclidean norm instead of the sum-of-squares. Theoretically, both models are equivalent: their optimal points are the same. But as a great man once said: in theory, theory and practice are the same. In practice, well... Take a look at the MOSEK log for the problem that minimizes the Euclidean norm (second MOSEK log below).

Note the following differences between the two MOSEK logs:

  • The problem status in the norm problem is primal and dual feasible and the solution status is optimal, in contrast to the least-squares problem.
  • The infinity-norms of the solutions to the norm problem are smaller by nearly five orders of magnitude than those of the least-squares problem. This is usually a desirable trait, as explained in the debugging section of the MOSEK docs.
  • The norm problem takes far fewer iterations.
  • The objective value of the norm problem is essentially the square root of that of the least-squares problem (the least-squares problem was not solved to optimality, hence the discrepancy).
Observe that the solution is guaranteed to be unique only if the objective is strictly convex. The objective function is strictly convex if and only if the A matrix (following this notation) has full column rank. If this is not the case, the two problems may output different optimal solution vectors (as happened above), but, as Orwell would put it, one of those solutions will be more equal than the other.

So, at least for CVXPY-MOSEK users (and MOSEK Fusion users!), we recommend the Euclidean norm in place of the sum-of-squares, wherever possible.
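The numerical intuition behind this advice can be sketched in a few lines of NumPy (a toy illustration, not the user's actual model): the two objectives share the same minimizer, but the sum-of-squares objective effectively squares the conditioning the solver has to cope with.

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((50, 10))  # full column rank (almost surely)
b = rng.standard_normal(50)

# Both objectives share the same minimizer ...
x = np.linalg.lstsq(A, b, rcond=None)[0]

# ... but the sum-of-squares model behaves like the normal equations
# A^T A x = A^T b, and cond(A^T A) = cond(A)^2 in the 2-norm.
print("cond(A)     =", np.linalg.cond(A))
print("cond(A^T A) =", np.linalg.cond(A.T @ A))
```

In CVXPY terms the swap amounts to minimizing `cvxpy.norm(A @ x - b)` instead of `cvxpy.sum_squares(A @ x - b)`: the optimal `x` is unchanged, while the reported objective value is the square root of the sum-of-squares value.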

Tuesday, September 14, 2021

Portfolio Optimization Workshop and Cookbook

We are pleased to announce the MOSEK Portfolio Optimization Workshop, a one-day event on the theoretical and practical aspects of portfolio optimization using MOSEK. The workshop takes place at our location in Copenhagen (this is NOT a virtual event!) on Thursday, November 18th, 2021. Topics include:

  • An introduction to portfolio optimization using MOSEK
  • Advanced topics in portfolio optimization
  • Tracking error and portfolio construction
  • MOSEK licensing, developments and news.
There will also be ample time for discussions. We provide lunch for the participants. For the full program with abstracts see the poster:


Participation is free and open to all but we would like you to register via this form in order to keep track of numbers.

Announcing the workshop is also an opportunity to present the first version of the MOSEK Portfolio Optimization Cookbook, which provides an introduction to the topic of portfolio optimization and discusses several branches of practical interest from this broad subject, illustrated with examples using the MOSEK Fusion API. For more information about this topic, including links to the cookbook and the accompanying Python notebooks, visit our comprehensive Portfolio Optimization Resource Page.

Friday, August 6, 2021

MOSEK 9.3 is released

We have released MOSEK 9.3. Release notes:

https://docs.mosek.com/9.3/releasenotes/index.html

This version is a direct continuation of 9.2 with no changes to existing interfaces. The new features are:

  • Support for Linux on ARM64 (aarch64), including the Optimizer API and Fusion API for C/C++, Java, Python and .NET. The interior-point and conic optimizers are single-threaded on that platform.
  • Updated FLEXlm to version 11.18. In particular, floating license users who upgrade clients to MOSEK 9.3 must also upgrade the license server binaries (lmgrd) to the ones from the 9.3 MOSEK distribution for compatibility. The new license server is, as always, backwards compatible.
  • Improved performance when solving many tasks in parallel.

Wednesday, June 23, 2021

MOSEK 9.3 beta, support for linuxaarch64

We have released a beta version of MOSEK 9.3. It is currently only available for download from our website:

https://www.mosek.com/downloads/9.3.0/

Release notes:

https://docs.mosek.com/9.3/releasenotes/index.html

This version is a direct continuation of 9.2 with no changes to existing interfaces. The new features are:

  • Support for Linux on ARM64 (aarch64), including the Optimizer API and Fusion API for C/C++, Java, Python and .NET. The interior-point and conic optimizers are single-threaded on that platform.
  • Updated FLEXlm to version 11.18. In particular, floating license users who upgrade clients to MOSEK 9.3 must also upgrade the license server binaries (lmgrd) to the ones from the 9.3 MOSEK distribution for compatibility. The new license server is, as always, backwards compatible.
  • Improved performance when solving many tasks in parallel.

Friday, June 4, 2021

Price increase from October 2021

In February 2020 we announced a price increase, which was later postponed due to the COVID-19 pandemic.

This price increase will now come into effect from October 1st, 2021.

Prices go up by 5.4% on average: the basic PTS and PTON floating license prices increase by 100 USD each, and the remaining values are adjusted accordingly, following our common rule that a server license costs 4 times the floating license and that annual maintenance for each part is 25% of the base price for that part.
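For readers who want to sanity-check a quote, the stated rules can be written out directly (the base price below is a made-up placeholder, not an actual MOSEK price; the real figures are on the pricing page):

```python
# Pricing rules from the announcement, applied to a HYPOTHETICAL
# floating-license base price.
floating = 1000.0                        # placeholder base price, USD
server = 4 * floating                    # server license = 4x floating
floating_maintenance = 0.25 * floating   # annual maintenance = 25% of base
server_maintenance = 0.25 * server

print(server, floating_maintenance, server_maintenance)
```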

New prices can be found at the top of

https://www.mosek.com/sales/commercial-pricing

The current prices have remained unchanged since 2015.

Tuesday, April 20, 2021

Data-driven distributionally robust optimization with MOSEK

We posted a video in which our very own Utkarsh Detha, joined by the authors of the award-winning paper "Data-driven distributionally robust optimization using the Wasserstein metric: performance guarantees and tractable reformulations", Assistant Prof. Peyman Mohajerin Esfahani and Prof. Daniel Kuhn, discusses the work presented in that paper.

We also show how to quickly and easily implement such an approach using our Fusion API for Python. This video focuses on the key new feature: parameters in Fusion. Parametric Fusion allows MOSEK to rapidly resolve a model, and combined with the warm-start capability of the Simplex optimizer, it becomes a powerful tool in every optimizer's garage.

See: the video, our notebook on the same topic, and the research paper.


Monday, December 14, 2020

MOSEK in the browser and OptServer

Although both installing MOSEK and obtaining a trial license are very easy, it is now possible to get to know MOSEK directly in the browser with one click and no setup at all.

Python

With a Google account you can run MOSEK in Google Colab notebooks. Click below to open the sample MOSEK Colab notebook in your user space and run it.


(You can also just download the notebook and run it in Jupyter or your favorite environment, but that's more than one click.) Since this is a complete MOSEK installation in Python, the full Optimizer and Fusion APIs are available.

Javascript Fusion

The Fusion API is experimentally available in Javascript. Just click below to open the editor in the browser and start coding and solving optimization problems in our Fusion API.


An almost complete Fusion API is mapped through, and you can run a number of ready-made examples.

How does it work?

The key to the simplicity lies in the fact that the actual optimizations are performed on our freely available demo Optimization Server (OptServer) running at https://solve.mosek.com. On that page you can also find instructions for invoking remote optimization in all other MOSEK interfaces; that is how the Python notebook calls the OptServer. The Javascript package runs the Fusion modeling layer in the browser and sends a JSON task object for remote optimization.

Our online demo server has problem size limits. If you like the concept of remote optimization, it is almost as easy as this to set up your own demo OptServer in a Docker container and use that to optimize.