Monday, December 18, 2023

Christmas lights

In this year's Christmas blogpost, we explain how to properly hang the Christmas lights on the Christmas tree 🎄.

Let us fix some notation. The 🎄 in dimension $D\geq 2$ is (surprise!) a reflected and rescaled quadratic cone:

$$(x_1,\ldots,x_D) \in \unicode{x1F384} \subseteq\mathbb{R}^D\quad  \iff\quad W(1-x_D/T)\geq \|x_{1:D-1}\|_2,\ x_D\geq 0$$

where $T$ is the coordinate of the top and $W$ is a width rescaling parameter. The tree comes decorated with $P$ ornaments located at $c_1,\ldots,c_P\in\unicode{x1F384}$. Here is an example in dimension $D=2$.

The Christmas lights chain consists of $N$ bulbs evenly spaced at distance $L$ along the wire. Their locations $x_1,\ldots,x_N\in \unicode{x1F384}$ are to be determined. If, like us, you have been collecting and repairing your lights for almost 25 years, then it is possible that each of the lightbulbs shines with a different power $I_1,\ldots,I_N$.

The main requirement we have is that each of the ornaments is nicely illuminated; more precisely, we wish to maximize the total amount of light received by the worst-illuminated ornament. Let $\mathrm{il}_j(d)$ be the amount of light that a point at distance $d$ from the light source receives from the $j$-th lightbulb. Obviously $\mathrm{il}_j(d)$ is a decreasing function of the distance $d$ to the bulb. That leads us to the optimization problem:

$$\begin{array}{llrr}\mathrm{maximize} & \mathrm{il}_{MIN}  & &\\ \mathrm{s.t.} & \mathrm{il}_{MIN}\leq \sum_{j=1}^N \mathrm{il}_j(d_{ij}) & i=1,\ldots,P & (2.1) \\ & d_{ij}\geq \|x_j-c_i\|_2 & i=1,\ldots,P,\ j=1,\ldots,N &(2.2) \\ & \|x_j-x_{j+1}\|_2\leq L & j=1,\ldots,N-1 & (2.3) \\ & x_j \in \unicode{x1F384} & j=1,\ldots,N &(2.4) \\ & x_1 = (W,0,\ldots) & &(2.5)\end{array}$$

In principle we should let the brightness of the $j$-th bulb decay as $\mathrm{il}_j(d)=I_j/d^{D-1}$, but that does not lead to a convex constraint in (2.1). Instead, let us use a piecewise-linear approximation $\widetilde{\mathrm{il}_j}(d)=\max(I_j-d,0)$. This function is not concave either, so it cannot be used directly in (2.1) in a continuous fashion, but when written as

$$(d\in[0,I_j]\ \mathrm{and}\ \widetilde{\mathrm{il}_j}(d) = I_j-d)\ \mathrm{or}\ (d\geq I_j\ \mathrm{and}\ \widetilde{\mathrm{il}_j}(d) = 0)$$

it can evidently be expressed via a disjunctive constraint. Here is a rough comparison between $\mathrm{il}_j(d)$ and $\widetilde{\mathrm{il}_j}(d)$:

Let us quickly recap problem (2): constraint (2.1) (with $\widetilde{\mathrm{il}_j}$ in place of $\mathrm{il}_j$) computes the approximate amount of light received by each ornament as a function of the distance $d_{ij}$ from the $i$-th ornament to the $j$-th bulb; that distance is computed in (2.2). Two consecutive lightbulbs cannot be spaced by more than the wire length between them, as in (2.3), constraint (2.4) keeps every bulb on the tree, and (2.5) ensures that the chain starts at the fixed bottom point of the tree, where the electric socket is. The complete MOSEK Fusion model can be found on GitHub; it is a mixed-integer (due to the disjunctive constraints) second-order cone optimization problem.
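Purely for illustration, here is a rough sketch of how these ingredients could be put together in Python Fusion. It is not the actual model from GitHub; the data (tree shape, ornament positions, bulb intensities) and all variable names are made up for this example.

from mosek.fusion import Model, Domain, Expr, ObjectiveSense, DJC

D, N, P = 2, 3, 2                        # dimension, number of bulbs, number of ornaments
W, T, L = 3.0, 6.0, 2.5                  # tree width, top coordinate, wire length between bulbs
I = [2.0, 1.5, 2.0]                      # bulb intensities I_j
c = [[0.5, 1.0], [-1.0, 2.0]]            # ornament positions c_i (made-up data)

with Model("xmastree") as M:
    x  = M.variable("x",  [N, D], Domain.unbounded())       # bulb positions
    d  = M.variable("d",  [P, N], Domain.greaterThan(0.0))  # distances d_ij
    il = M.variable("il", [P, N], Domain.greaterThan(0.0))  # light from bulb j on ornament i
    ilmin = M.variable("ilmin")

    xj = lambda j: Expr.flatten(x.slice([j, 0], [j + 1, D]))   # j-th bulb as a length-D vector

    for j in range(N):
        # (2.4) the bulb stays on the tree: W(1 - x_D/T) >= ||x_{1:D-1}||_2, x_D >= 0
        M.constraint(x.index(j, D - 1), Domain.greaterThan(0.0))
        M.constraint(Expr.vstack(Expr.sub(Expr.constTerm(W), Expr.mul(W / T, x.index(j, D - 1))),
                                 Expr.flatten(x.slice([j, 0], [j + 1, D - 1]))),
                     Domain.inQCone())
        # (2.3) consecutive bulbs at most L apart
        if j + 1 < N:
            M.constraint(Expr.vstack(Expr.constTerm(L), Expr.sub(xj(j), xj(j + 1))),
                         Domain.inQCone())

    # (2.5) the chain starts at the bottom of the tree
    M.constraint(xj(0), Domain.equalsTo([W, 0.0]))

    for i in range(P):
        for j in range(N):
            # (2.2) d_ij >= ||x_j - c_i||_2
            M.constraint(Expr.vstack(d.index(i, j), Expr.sub(xj(j), c[i])), Domain.inQCone())
            # piecewise-linear illumination via a disjunction:
            # (d_ij <= I_j and il_ij = I_j - d_ij) or (d_ij >= I_j and il_ij = 0)
            M.disjunction(
                DJC.AND(DJC.term(d.index(i, j), Domain.lessThan(I[j])),
                        DJC.term(Expr.add(il.index(i, j), d.index(i, j)), Domain.equalsTo(I[j]))),
                DJC.AND(DJC.term(d.index(i, j), Domain.greaterThan(I[j])),
                        DJC.term(il.index(i, j), Domain.equalsTo(0.0))))
        # (2.1) the worst-illuminated ornament receives at least ilmin
        M.constraint(Expr.sub(Expr.sum(il.slice([i, 0], [i + 1, N])), ilmin),
                     Domain.greaterThan(0.0))

    M.objective(ObjectiveSense.Maximize, ilmin)
    M.solve()
    print(x.level())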

We end with some examples. The ornaments have been given intensity according to the amount of light received. The disk sizes correspond to the brightness of lightbulbs.


It remains to wish all readers a very well-lit 🎄 and a Happy New Year 2024!

The MOSEK team

Tuesday, November 28, 2023

MOSEK in Abu Dhabi

Next week MOSEK will be in Abu Dhabi attending the dual conference Converging paths in commodity and financial market analysis. It consists of the established conferences Research in Options and Euro Working Group for Commodity and Financial Modelling.

If you are going to the conference or if you are based in Abu Dhabi and would like to discuss MOSEK, don’t hesitate to reach out to sales@mosek.com.

See you in Abu Dhabi!

Monday, October 9, 2023

Irreducible Infeasible Subsets

When an optimization problem is infeasible it is common to ask for the reason for infeasibility. Very often the model is infeasible because it contains a set of mutually contradictory constraints which is relatively small compared to the total size of the problem. Locating and analyzing such a subset is usually helpful in understanding the reasons for infeasibility.

Such considerations have led the OR community to come up with the notion of an Irreducible Infeasible Set (IIS) - a subset of constraints which is infeasible but cannot be reduced to a smaller infeasible set (all its proper subsets are already feasible). Every infeasible problem has an IIS - possibly many, possibly of different sizes. For example, the problem bgdbg1 (see below) has a few hundred bounds, but its infeasibility is already caused by the small fragment

X104_JY >= 0.0
X104_JH >= 36.0
X104_JY + X104_JH <= 6.0

consisting of 3 bounds, which is an IIS. In fact the same model also has another IIS

X199_CM1 >= 96.0
X199_CM1 <= 86.0

with just 2 bounds (it doesn't get any better - a single linear bound is always feasible on its own).

An IIS can be found by a simple iterative algorithm which tries to remove one bound at a time and checks whether the problem remains infeasible. We have implemented it for linear and mixed-integer linear problems in a recently published Python notebook, where you can also read more details about the algorithm and its implementation. You can also download the Python code directly.
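To make the idea concrete, here is a schematic version of such a deletion filter, not the notebook's actual code. It assumes a user-supplied helper is_infeasible(bounds) (hypothetical, not a MOSEK function) which solves the problem with only the given bounds enforced and reports whether it is still infeasible.

import random

def find_iis(bounds, is_infeasible):
    # Shrink an infeasible set of bounds to an irreducible infeasible subset.
    assert is_infeasible(bounds), "start from an infeasible set of bounds"
    bounds = list(bounds)
    random.shuffle(bounds)                       # the IIS found depends on the ordering
    i = 0
    while i < len(bounds):
        candidate = bounds[:i] + bounds[i + 1:]  # tentatively remove the i-th bound
        if is_infeasible(candidate):
            bounds = candidate                   # still infeasible: drop the bound for good
        else:
            i += 1                               # the bound is necessary: keep it
    return bounds                                # no bound can be removed, hence an IIS

Each loop iteration requires one solve, so the cost is essentially one optimization per bound in the starting set.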

Below are the results produced by our sample code on the infeasible problems from NETLIB. We report the number of bounds in the problem, the size of the (Farkas) infeasibility certificate (these are the bounds that appear in the infeasibility report or are returned by the task.getinfeasiblesubproblem() function), and finally the size of the smallest IIS out of 5 runs of the algorithm (the IIS found depends on the ordering of the bounds, which we randomize).

bgdbg1.gz: OK, all bounds = 926, Farkas size = 2, IIS size = 2
bgetam.gz: OK, all bounds = 1577, Farkas size = 31, IIS size = 25
bgindy.gz: OK, all bounds = 14694, Farkas size = 196, IIS size = 154
bgprtr.gz: OK, all bounds = 68, Farkas size = 12, IIS size = 10
box1.gz: OK, all bounds = 723, Farkas size = 10, IIS size = 9
ceria3d.gz: OK, all bounds = 3576, Farkas size = 117, IIS size = 198
chemcom.gz: OK, all bounds = 1416, Farkas size = 86, IIS size = 37
cplex2.gz: numerical issues, all bounds = 733, Farkas size = 177, IIS size = 715
ex72a.gz: OK, all bounds = 609, Farkas size = 61, IIS size = 60
ex73a.gz: OK, all bounds = 597, Farkas size = 28, IIS size = 25
forest6.gz: OK, all bounds = 196, Farkas size = 96, IIS size = 92
galenet.gz: OK, all bounds = 26, Farkas size = 5, IIS size = 5
gosh.gz: OK, all bounds = 15353, Farkas size = 9, IIS size = 9
gran.gz: OK, all bounds = 8386, Farkas size = 2, IIS size = 2
grbeai.gz: OK, all bounds = 10402, Farkas size = 93, IIS size = 54
itest2.gz: OK, all bounds = 13, Farkas size = 3, IIS size = 3
itest6.gz: OK, all bounds = 21, Farkas size = 3, IIS size = 3
klein1.gz: OK, all bounds = 108, Farkas size = 55, IIS size = 54
klein2.gz: OK, all bounds = 531, Farkas size = 57, IIS size = 55
klein3.gz: numerical issues, all bounds = 1082, Farkas size = 97, IIS size = 192
mondou2.gz: OK, all bounds = 1832, Farkas size = 24, IIS size = 23
pang.gz: OK, all bounds = 942, Farkas size = 34, IIS size = 32
pilot4i.gz: OK, all bounds = 1886, Farkas size = 43, IIS size = 47
qual.gz: OK, all bounds = 1360, Farkas size = 217, IIS size = 370
reactor.gz: OK, all bounds = 1652, Farkas size = 15, IIS size = 9
refinery.gz: OK, all bounds = 1360, Farkas size = 206, IIS size = 120
vol1.gz: OK, all bounds = 1360, Farkas size = 275, IIS size = 196
woodinfe.gz: OK, all bounds = 173, Farkas size = 2, IIS size = 2

The work reported here and in the notebook was carried out by Adam Bosák during his student internship at MOSEK. Thanks!

INFORMS Annual Meeting

Next week the INFORMS Annual Meeting will take place in Phoenix and MOSEK will of course be there. If you are going to the INFORMS Annual Meeting and would like to discuss MOSEK, don’t hesitate to reach out to sales@mosek.com.

See you in Phoenix!

Monday, October 2, 2023

Autumn and MOSEK

The summer has been long here in Copenhagen, but now it feels like the fall is here to stay. For MOSEK the arrival of autumn means the arrival of thousands of new people eligible to use MOSEK for free.
This is because MOSEK, through our academic initiative, grants free licenses for research or educational purposes at degree-granting academic institutions.

Whether you are a new student or a seasoned academic, why not use this opportunity to try MOSEK!

 

Thursday, September 21, 2023

Apple Silicon and Matlab R2023b

MATLAB R2023b is the first stable release with native support for the Apple Silicon M1/M2 platform.

The MOSEK Optimization Toolbox for MATLAB is available natively for Apple Silicon from MOSEK 10.1. This means that from version 10.1 there are two ways to use MOSEK in MATLAB on the M1/M2 platform:

  • (natively) Use the Apple Silicon release of MATLAB (version 2023b+, architecture MACA64) and the osxaarch64 MOSEK package, with its included toolbox for MATLAB.
  • (via Rosetta, the "old" way) Use the Intel release of MATLAB (any version, architecture MACI64) and the osx64x86 MOSEK package with its included toolbox for MATLAB. 
In other words, the architecture of MOSEK should match the architecture of the MATLAB installation.

The "old" way, with emulation via Rosetta, may be relevant for users of binary MATLAB packages which do not (yet) have native releases for M1/M2 but only for the Intel-based Macs. This is (currently) the case, for instance, with CVX.

Monday, September 18, 2023

MIP improvements in 10.1

 

In a previous blogpost we talked about additions made to MOSEK with the release of version 10.1. However, as usual with new releases, improvements to the existing functionality have also been made.

One area where we have improved a lot with our latest release is the mixed-integer solver. On our internal benchmarking set we have seen an improvement of over 30 per cent in the geometric mean of the solution time compared to MOSEK 10.0.

Does this mean you will see a 30 per cent improvement on your mixed integer problems? No, or yes or maybe… It depends!

Mixed-integer problems are hard, NP-complete in fact, and in the design of the solver algorithms there are tradeoffs to be made. These tradeoffs will make the solver good for certain problems and less so for others. Tuning the solver determines which problems it will perform well on and which it will not.

At MOSEK we tune our solvers based on our internal benchmarking sets. These sets are mainly based on problems we have received from customers, and we feel they are representative of the problems being solved for practical applications.

So, does this mean you will see a 30 per cent improvement on your mixed integer problems? No, not for certain. But it might be worthwhile to upgrade to 10.1 and test and see!

Monday, September 11, 2023

Remote optimization with OptServer(Light)

Together with the recently released MOSEK 10.1 we introduced OptServerLight - a light version of a remote optimization server. Let us take this opportunity to write a few words about remote optimization in MOSEK and introduce the two variants of the OptServer.

In a typical scenario the user sets up an optimization problem in MOSEK (through an API, or by reading a file), solves it, and retrieves the solution. All steps are executed in one process, on the same machine. Under remote optimization the user still sets up the problem locally (in the client), but the optimization itself (the call to MSK_optimize(), optimize(), solve() or similar) is executed on a remote machine (the server), after which the user again retrieves the solution in the local process on the client. In practice the client submits the problem data to the server via HTTP/HTTPS, the server calls its own instance of MOSEK to solve the problem and transmits the solution back to the client in the response.

From the client's point of view the process is completely transparent. All it takes is to instruct MOSEK to contact the remote server instead of optimizing locally. Everything else - setting up the problem, retrieving the solution, printing log output - remains unchanged. Here is an illustration in Python Fusion:
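A minimal example along those lines could look as follows; the tiny LP and the server address below are only for illustration, see https://solve.mosek.com for the actual demo setup and code samples.

from mosek.fusion import Model, Domain, Expr, ObjectiveSense

with Model("remote-lo") as M:
    # A tiny LP: maximize 3x0 + x1 + 5x2 subject to x0 + x1 + 2x2 <= 4, x >= 0.
    x = M.variable("x", 3, Domain.greaterThan(0.0))
    M.constraint(Expr.dot([1.0, 1.0, 2.0], x), Domain.lessThan(4.0))
    M.objective(ObjectiveSense.Maximize, Expr.dot([3.0, 1.0, 5.0], x))

    # The only line that differs from local optimization: point MOSEK to the OptServer.
    # Address of the demo server assumed here; check https://solve.mosek.com for the current one.
    M.optserverHost("http://solve.mosek.com:30080")

    M.solve()
    print(x.level())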


The client code is the same for local and remote optimization, except for the one line which introduces the address of the remote OptServer. This code will, in fact, work, because the URL points to an existing OptServer which we made available online and which can solve small problems for demonstration purposes. See https://solve.mosek.com for more information and downloadable code samples.

Why/when use the remote OptServer?
  • One centralized computational server to which clients offload all heavy computations.
  • The problems are too big/time consuming for weak client machines.
  • No need for the clients to have a MOSEK license; only the server requires a license.
  • Solving problems produced by a different source than a MOSEK client.
  • Load and user access control.
We provide two flavors of the OptServer:
  • OptServerLight - a minimalistic binary. Shipped in the distribution, started directly from command line with minimal or no configuration, works out-of-the-box, available for all platforms. Keeps no state, works in-memory, suitable as a simple solver service in container pipelines. Very basic load balancing and configuration are available. Recommended as a starting point and probably sufficient for a majority of applications. See the brief startup notes.
  • The full OptServer. In addition to the solver service provides a user API, job history, authentication, API tokens, various levels of permissions, administrator/user accounts, statistics, web interface and more. Requires a more elaborate setup, including configuring a database. See the demo docker image for a stand-alone container and installation instructions.

Monday, August 14, 2023

All Rustaceans can now enjoy MOSEK through a fully supported official API

Rust is a fairly new programming language that is gaining momentum due to its performance and memory safety. Hence it is time to make a Rust interface available for MOSEK. The recently released MOSEK version 10.1 therefore includes the Optimizer API for Rust, which provides low-level access to all functionalities of MOSEK. It is essentially a thin interface to the native C Optimizer API with almost zero overhead.

Have a look at our API guide for code snippets and instructions on how best to solve your optimization problems with Rust and MOSEK.

To get started, download the MOSEK crate.


Monday, June 19, 2023

New Portfolio Optimization Cookbook chapters and notebooks

We are happy to share that our Portfolio Optimization Cookbook has been updated and new chapters have been added. We have also added new notebooks with code examples.
The additions cover, among other things:
  • CVaR
  • EVaR
  • Gaussian mixture return model
  • Risk budgeting
  • Robust optimization
  • Multiperiod optimization
As usual, the new concepts are implemented in MOSEK Fusion for Python.
Enjoy! 

Tuesday, June 6, 2023

MOSEK 10.1 - more platforms and OptServerLight

We are delighted to announce MOSEK 10.1 (Beta). It is a direct continuation of version 10.0 which adds support for:
  • Python 3.11
  • OptServerLight - a lightweight remote optimization server.
  • Rust and Julia Optimizer APIs as official APIs.
  • Native Apple M1 MOSEK Optimization Toolbox for MATLAB.
You can read more and find all the information, documentation and downloads on the latest release page.

Monday, April 24, 2023

SIAM Conference on Optimization (OP23)

MOSEK is proud to be one of the sponsors of the SIAM Conference on Optimization (OP23). SIAM OP23 will be held May 31 - June 3, 2023, at the Sheraton Grand Seattle in Seattle, Washington, U.S.

More details and registration at https://www.siam.org/conferences/cm/conference/op23

Wednesday, March 29, 2023

Mosek and AMPL

We are pleased to announce that AMPL Inc has become a reseller of MOSEK.

Moreover, a new AMPL to MOSEK interface was made available in the AMPL/MP open-source library of solver interfaces.

Details can be found on the MOSEK for AMPL page.

Easter 2023

Due to Easter, support and sales will be closed from Thursday, April 6th, until Monday, April 10th, both days inclusive.


The MOSEK team wishes everyone a peaceful Easter.

Tuesday, January 24, 2023

Mosek with AmpliconArchitect

AmpliconArchitect (AA), a Python tool for genomic reconstruction used in cancer biology, is in the process of being updated to make it compatible with all versions of MOSEK.

Until recently the tool was dependent on an older MOSEK API which required the use of MOSEK 8, making the installation process a bit more cumbersome, and making AA dependent on the discontinued, less accurate general convex optimizer.

With the new updates, installing MOSEK for use with AA will be as simple as "pip install mosek", and one gets easy access to MOSEK upgrades the same way. AA will use MOSEK to solve its optimization tasks with the more accurate, faster conic solver.

We would like to thank Jens Luebeck for guidance and help with installing, running and testing the new software.

For the time being you can try this new version via Jens' fork of AmpliconArchitect, where the readme file also provides all the necessary details. As always you can get a MOSEK personal academic license for academic use of AA via our website.