
LEGO

The Problem Description
Problem Description · Use Case · Objectives · AMPL · HiGHS
SERIES INTRO! This is the first of eighteen videos over which we build an app for the awesome Rebrickable® community: an app that finds multiple LEGO® sets buildable concurrently from a user’s part inventory.
Plan the Buildout
App Build Plan · Python · UI
We sketch a plan for building our optimization application, starting with a bare-bones optimization model and then adding a data connection and simple text-based UI. Later we swap out the AMPL scripting language for Python and the text-based UI for Streamlit, and then containerize the app.
Unbox the Data!
GraphXR · EDA · Schema
We walk through our LEGO data from Rebrickable.com and - using GraphXR for exploratory data analysis (EDA) - we translate its schema into an Entity Relationship Diagram, yielding our optimization app’s main entities! (We also meet Legolas Greenleaf, Frankie Poullain and The Darkness …)
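For a taste of the EDA step, here is a minimal pandas sketch. The file and column names follow the public Rebrickable CSV dumps; the subset actually used in the episode may differ.

```python
# Minimal EDA sketch, assuming the Rebrickable CSV downloads sit in ./data/.
# File names follow the public Rebrickable dumps; the episode may use a
# different subset of them.
import pandas as pd

sets = pd.read_csv("data/sets.csv")                        # one row per LEGO set
inventories = pd.read_csv("data/inventories.csv")          # links sets to inventories
inventory_parts = pd.read_csv("data/inventory_parts.csv")  # parts and quantities per inventory
parts = pd.read_csv("data/parts.csv")                      # the part catalogue

# Quick schema check: which columns could join which tables?
for name, df in [("sets", sets), ("inventories", inventories),
                 ("inventory_parts", inventory_parts), ("parts", parts)]:
    print(name, "->", list(df.columns))
```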
Sketch the Model
Variables · Objectives · Constraints
Step-by-step we ideate model variables and constraints, jotting these down initially as natural language comments in our new AMPL model.mod file.
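As a rough illustration only, the comment-only skeleton of model.mod might start out something like this (the wording and entity list are placeholders, not the series’ actual notes), written here from Python for consistency with the other sketches on this page.

```python
# A guess at the comment-only skeleton of model.mod at this stage; the entity
# list and wording are placeholders, not the series' actual notes.
skeleton = """\
# SETS:        candidate LEGO sets, part types, (set, part) requirement pairs
# PARAMS:      parts required by each set, parts the user owns
# VARIABLES:   Build[s] = 1 if set s is selected, else 0
# OBJECTIVE:   maximize the number (or value) of sets built
# CONSTRAINTS: for every part, usage across selected sets <= user inventory
"""

with open("model.mod", "w") as f:
    f.write(skeleton)
```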
Connect Model to Data
Data Sources · Speed · Flat Files · AMPL
We establish a connection via an import script (written in the AMPL scripting syntax, later to be replaced by a Python script) to a collection of CSV and flat text data files.
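A hedged sketch of the same data connection, done the way the series later does it (Python plus amplpy) rather than with the AMPL import script shown in the episode; the file name, the set name PARTS and the parameter name have are placeholders.

```python
# Hedged sketch of the data connection via Python + amplpy rather than the
# AMPL import script. File, set and parameter names are placeholders.
import pandas as pd
from amplpy import AMPL

ampl = AMPL()
ampl.read("model.mod")

# User inventory: part number -> quantity owned, indexed the way AMPL expects.
my_parts = pd.read_csv("data/my_parts.csv").set_index("part_num")[["quantity"]]
my_parts.columns = ["have"]        # column name must match the AMPL param name
ampl.set_data(my_parts, "PARTS")   # fills set PARTS and param have {PARTS}
```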
Group SETS
SETS · Tuples · Speed · AMPL
We discuss why grouping sets of single entities into aggregate sets of tuples (aka ‘Cartesian products’ or ‘cross-joins’) dramatically increases the performance of our optimization app. We also discuss the accompanying increase in data memory footprint.
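A minimal illustration of the tuple-set pattern, declared through amplpy for consistency with the other sketches here; all names are placeholders.

```python
# Illustrative only: indexing data over one aggregate set of (set, part)
# tuples instead of the two single-entity sets. All names are placeholders.
from amplpy import AMPL

ampl = AMPL()
ampl.eval("""
set SETS;
set PARTS;
set SET_PARTS within {SETS, PARTS};   # the aggregate set of tuples
param need {SET_PARTS} > 0;           # defined only for pairs that occur
var Build {SETS} binary;

# A sum over the tuple set touches only real (set, part) combinations:
#   sum {(s, p) in SET_PARTS} need[s, p] * Build[s]
""")
```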
Add a Text-based User Interface
Flat Files · UI · AMPL
We add a simple but functional and attractive text-based user interface by way of a flat data file formatted for good readability and easy manual data input.
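A guess at what such a readable flat options file and its parser could look like; the actual field names and format used in the episode may differ.

```python
# A guess at the kind of readable flat 'UI' file the episode describes, plus a
# tiny parser; the real field names and format may differ.
OPTIONS_TEXT = """\
max_sets     = 5           # how many sets to suggest at most
min_pieces   = 100         # ignore very small sets
theme_filter = Star Wars   # only consider sets from this theme
"""

def parse_options(text: str) -> dict:
    """Parse 'key = value  # comment' lines into a dict of strings."""
    options = {}
    for line in text.splitlines():
        line = line.split("#", 1)[0].strip()   # strip comments and whitespace
        if not line:
            continue
        key, _, value = line.partition("=")
        options[key.strip()] = value.strip()
    return options

print(parse_options(OPTIONS_TEXT))
```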
Write the Model
AMPL · Constraints · Memory Efficiency
One by one, we convert our natural language comments representing model variables and constraints into actual model entities. In the process, we add display statements for health-checking the memory size and cardinality of each new model element.
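A minimal sketch of this kind of model, with the health-check display statements the episode mentions; entity names are placeholders, not necessarily those used in the series.

```python
# Minimal sketch of a multi-build model with cardinality health checks;
# entity names are placeholders.
from amplpy import AMPL

ampl = AMPL()
ampl.eval("""
set SETS;                                   # candidate LEGO sets
set PARTS;                                  # part types
param need {SETS, PARTS} >= 0 default 0;    # parts required per set
param have {PARTS} >= 0;                    # parts the user owns
var Build {SETS} binary;                    # 1 if the set is selected

maximize TotalSets: sum {s in SETS} Build[s];

subject to Inventory {p in PARTS}:
    sum {s in SETS} need[s, p] * Build[s] <= have[p];

# health checks: cardinality of what we just declared
display card(SETS), card(PARTS), card({SETS, PARTS});
""")
```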
Shrink the Problem
Speed · Set Work · Memory Efficiency · AMPL
With constraints in place that ensure selected LEGO® sets are complete, we now implement constraints limiting the search space to sets the user desires. We then observe that, while functional, a faster way to implement such ‘problem shrinking’ is via set manipulation rather than constraint imposition.
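A hedged sketch contrasting the two approaches, reusing the placeholder names from the model sketch above; ‘desired’ is an illustrative parameter.

```python
# Two hedged ways to restrict the problem to sets the user wants; 'desired'
# and the other names are placeholders.
from amplpy import AMPL

ampl = AMPL()
ampl.eval("""
set SETS;
param desired {SETS} binary default 1;   # 1 if the user wants this set considered
var Build {SETS} binary;

# Option A: a constraint (works, but adds rows the solver must carry).
subject to OnlyDesired {s in SETS: desired[s] = 0}: Build[s] = 0;

# Option B: set manipulation (usually faster: the unwanted sets never
# generate variables or rows in the first place).
set DESIRED_SETS := {s in SETS: desired[s] = 1};
# ...then index Build and the rest of the model over DESIRED_SETS instead.
""")
```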
Make on-the-fly model mods
Constraints · App Logic · Model Assembly at Runtime · AMPL
We add functionality to our script that drops unneeded constraints per user-specified options, effectively changing the form of our optimization model on-the-fly. We also discuss extreme approaches to such run-time model construction, in light of the modular nature of optimization model components.
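A hedged sketch of dropping a constraint at run time from Python; the constraint name OnlyDesired and the file names are illustrative.

```python
# Hedged sketch of dropping a constraint at run time; the constraint name
# OnlyDesired and the file names are illustrative.
from amplpy import AMPL

ampl = AMPL()
ampl.read("model.mod")
ampl.read_data("model.dat")

user_wants_all_sets = True
if user_wants_all_sets:
    # equivalent AMPL command: drop OnlyDesired;
    ampl.get_constraint("OnlyDesired").drop()

ampl.solve()
```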
Tune the Solver!!!
Solver Tuning · Speed · HiGHS
We explore how the various parameters and options of our linear solver (HiGHS) affect solving time, and use them to tune the solver for significantly faster performance on our optimization problem.
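A sketch of passing options to HiGHS through AMPL; the specific option names shown are assumptions, so list your driver’s actual options (for example with `highs -=`) before relying on them.

```python
# Solver-tuning sketch. The HiGHS driver option names below are assumptions;
# check your driver's own option list (e.g. `highs -=`) for the real ones.
from amplpy import AMPL

ampl = AMPL()
ampl.read("model.mod")
ampl.read_data("model.dat")

ampl.set_option("solver", "highs")
ampl.set_option("highs_options", "outlev=1 threads=4 mip:gap=0.01 lim:time=60")
ampl.solve()
```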
Switch our glue language to Python
Python · App Logic · AMPL
Having worked thus far only in the AMPL modeling and scripting languages, we jettison the AMPL scripting language and replace it with Python. This transforms our model into a component callable from within Python via the AMPL-Python API.
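A bare-bones sketch of what the Python ‘main script’ might look like after the switch; file and entity names are placeholders.

```python
# Bare-bones sketch of the Python 'main script' after the switch to amplpy;
# file and entity names are placeholders.
from amplpy import AMPL

def solve_multibuild(model_path="model.mod", data_path="model.dat"):
    """Load the AMPL model and data, solve with HiGHS, return Build values."""
    ampl = AMPL()
    ampl.read(model_path)        # the unchanged AMPL model file
    ampl.read_data(data_path)    # the data the AMPL script used to import
    ampl.set_option("solver", "highs")
    ampl.solve()
    return ampl.get_variable("Build").get_values()

if __name__ == "__main__":
    print(solve_multibuild())    # which candidate sets were selected
```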
Add a Streamlit User Interface
Streamlit · UI · AI-Assisted Coding
Now that our optimization app’s non-model logic script (aka ‘the main script’) is in Python, we add a lovely Streamlit UI, created with help from our cheerful AI coding assistant, to replace our text-based UI.
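A minimal Streamlit sketch of such a front end; the widget labels are illustrative and the solver call is stubbed so the sketch runs on its own.

```python
# Minimal Streamlit front-end sketch; widget labels are illustrative and the
# solve step is stubbed so this runs stand-alone.
import pandas as pd
import streamlit as st

def solve_multibuild(parts_file) -> pd.DataFrame:
    # Placeholder for the amplpy-based solve sketched earlier on this page.
    return pd.DataFrame({"set_num": ["0000-1"], "name": ["Example Set"]})

st.title("LEGO Multi-Build Optimizer")

uploaded = st.file_uploader("Upload your Rebrickable parts list (CSV)", type="csv")
max_sets = st.slider("Maximum number of sets to suggest", 1, 20, 5)

if uploaded is not None and st.button("Find buildable sets"):
    with st.spinner("Solving..."):
        results = solve_multibuild(uploaded)
    st.dataframe(results)
```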
Go Dynamic! ...with iterative solving
Dynamic Optimization · UI · Streamlit · AMPL
Our new Streamlit interface makes it easy to imagine and implement an important new feature: the ability to ‘pin’ one or more suggested LEGO® sets, discard the others, and re-solve, so that the user can iteratively arrive at an attractive collection of buildable sets.
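A hedged sketch of the pin-and-re-solve idea using plain AMPL fix commands issued through amplpy; variable, set-number and file names are placeholders.

```python
# Hedged sketch of pin-and-re-solve via AMPL fix commands; variable,
# set-number and file names are placeholders.
from amplpy import AMPL

ampl = AMPL()
ampl.read("model.mod")
ampl.read_data("model.dat")
ampl.set_option("solver", "highs")

pinned = ["0000-1"]          # set numbers the user chose to keep
for set_num in pinned:
    # keep this set selected in every subsequent solve
    ampl.eval(f"fix Build['{set_num}'] := 1;")

ampl.solve()                 # the remaining Build variables are re-optimized
```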
Let's Containerize!
Containerization · Docker · Docker Compose · Hardware
We are almost done with our Multi-Build optimization app! Now is a good time to containerize our app so that it can easily be deployed to various environments and even scaled across multiple heterogeneous environments at once.
Spec (rock!) the Hardware ;)
Hardware · Speed
Now, let’s dream out loud about some fast hardware platforms to run our app. Physical memory allotment, CPU thread speed, hard disk speed, non-optimization process load, networking and remote database access bandwidth requirements … all impact our hardware specifications.
Mistakes, lies & abandoned features
Problem Description · Use Case · Mistakes · Speed · App Build Plan
I reveal some undisclosed turns taken during my initial development of this app. Most notably: an abandoned attempt to implement a recursive search for nested sets within sets, which would potentially have enabled some cool additional features and which I might return to later.
RECAP! ...and what happens next
Speed · Summary · App Handover
A recap of the main points of optimization craft covered in this series, including every major speed-enhancing maneuver. I also ponder out loud what interactions with the Rebrickable team might hold for the app’s real-world deployment, and how to approach this interaction as good citizens.