Very interesting quiz! With some help from ChatGPT, I managed to reach a solution that I believe is very good. I describe the process below. I used the free Google Colab.
I started by loading the dataset and converting the “Tax Assessed” column from mixed currency format (e.g., “4 gp 3 sp”) into a single numeric value in silver pieces. Then I trained a machine learning model (a Random Forest Regressor) to approximate the cost function `f(a, b, c, d, e)` based on item counts. After evaluating the model's performance and confirming strong predictive accuracy (R² ≈ 0.99), I tackled the partitioning problem: dividing a fixed multiset of items into 4 groups such that the sum of predicted costs is minimized.

I used a greedy assignment algorithm to initialize the groups, followed by a single-item swap optimizer to improve local distributions. To prevent unrealistic zero-cost predictions for small groups, I used a `safe_predict()` function that enforces a minimum cost of 15 for any non-empty group, ensuring more balanced and reliable optimization. Finally, I applied a multi-item swap optimizer to refine the solution further. The result was a cost-efficient and balanced grouping, achieved without brute force, using practical heuristics and model-guided local search.
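As a sketch of the currency-conversion step (the `to_silver` name and the 1 gp = 10 sp / 1 cp = 0.1 sp rates are my own assumptions, based on the standard D&D exchange; adjust if the quiz uses different denominations):

```python
import re

# Assumed conversion rates to silver pieces (1 gp = 10 sp, 1 cp = 0.1 sp).
RATES = {"gp": 10.0, "sp": 1.0, "cp": 0.1}

def to_silver(text):
    """Parse a mixed-currency string like '4 gp 3 sp' into silver pieces."""
    total = 0.0
    for amount, unit in re.findall(r"(\d+)\s*(gp|sp|cp)", text):
        total += int(amount) * RATES[unit]
    return total

print(to_silver("4 gp 3 sp"))  # 43.0
```

Applied with `df["Tax Assessed"].map(to_silver)`, this yields the numeric target column for the regressor.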
This solution gave me the following partitions. (UPDATE: I re-edited the spoiler tags, but they do not seem to activate. I started a block with `<` followed by `!` as the first character of a line, but nothing happened. I even added the mobile spoiler tags.)
Group 1: [0, 1, 0, 0, 3] | 40.00
Group 2: [0, 0, 5, 0, 1] | 50.00
Group 3: [3, 2, 0, 6, 3] | 131.59
Group 4: [1, 1, 0, 1, 1] | 15.00
Total cost: 236.59
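For reference, a minimal sketch of the `safe_predict()` floor and the single-item swap pass. A toy linear cost stands in for the trained Random Forest, and all names besides `safe_predict` are my own; the real code would call `model.predict` on each 5-dimensional count vector.

```python
# Toy stand-in for the trained model's cost estimate on a count vector.
def predict_cost(group):
    return sum(c * w for c, w in zip(group, [3, 5, 2, 4, 6]))

def safe_predict(group):
    """Clamp the estimate to a floor of 15 sp for any non-empty group."""
    if sum(group) == 0:
        return 0.0
    return max(15.0, predict_cost(group))

def total_cost(groups):
    return sum(safe_predict(g) for g in groups)

def swap_optimize(groups, n_items=5, max_rounds=100):
    """Move single items between groups while the total predicted cost drops."""
    for _ in range(max_rounds):
        improved = False
        for src in range(len(groups)):
            for dst in range(len(groups)):
                if src == dst:
                    continue
                for i in range(n_items):
                    if groups[src][i] == 0:
                        continue
                    before = safe_predict(groups[src]) + safe_predict(groups[dst])
                    groups[src][i] -= 1
                    groups[dst][i] += 1
                    after = safe_predict(groups[src]) + safe_predict(groups[dst])
                    if after < before:
                        improved = True   # keep the move
                    else:
                        groups[src][i] += 1  # revert the move
                        groups[dst][i] -= 1
        if not improved:
            break
    return groups
```

The multi-item swap pass is the same idea with moves of more than one item at a time; only pairs of groups change cost under a move, so each candidate is evaluated with two `safe_predict` calls rather than re-scoring all four groups.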