Category: AI
Flag: apoorvctf{l0r4_m3rg3}
Challenge Description
Two files. One network.
You’re handed a base model and an adapter. Alone, they’re meaningless. Together… well, that’s for you to figure out.
Find the flag.
Analysis
The challenge handed over base_model.pt and lora_adapter.pt with the hint that they only make sense together, so the first thing to verify was whether these were standard PyTorch checkpoints and what each one actually contained.
file /home/rei/Downloads/base_model.pt /home/rei/Downloads/lora_adapter.pt

/home/rei/Downloads/base_model.pt:   Zip archive data, made by v0.0, extract using at least v0.0, last modified Sun, ? 00 1980 00:00:00, uncompressed size 682, method=store
/home/rei/Downloads/lora_adapter.pt: Zip archive data, made by v0.0, extract using at least v0.0, last modified Sun, ? 00 1980 00:00:00, uncompressed size 257, method=store

Both files were zip-backed checkpoints, so listing members was enough to confirm they looked like serialized tensors rather than some custom container.
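The same kind of member listing can be reproduced from Python with the standard zipfile module. A minimal sketch (the stand-in archive below only mimics the checkpoint layout; the real files are loaded later with torch):

```python
import io
import zipfile

# torch.save writes an ordinary zip archive, which is why `file` reported
# "Zip archive data". A stand-in archive mimics the member layout seen above.
buf = io.BytesIO()
with zipfile.ZipFile(buf, 'w') as zf:
    zf.writestr('base_model/data.pkl', b'\x80\x02')  # pickled tensor metadata
    zf.writestr('base_model/data/0', b'\x00' * 16)   # raw tensor storage bytes

with zipfile.ZipFile(io.BytesIO(buf.getvalue())) as zf:
    members = zf.namelist()
print(members)  # -> ['base_model/data.pkl', 'base_model/data/0']
```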
unzip -l /home/rei/Downloads/base_model.pt

Archive:  /home/rei/Downloads/base_model.pt
Length Date Time Name
--------- ---------- ----- ----
682 00-00-1980 00:00 base_model/data.pkl
1 00-00-1980 00:00 base_model/.format_version
2 00-00-1980 00:00 base_model/.storage_alignment
6 00-00-1980 00:00 base_model/byteorder
65536 00-00-1980 00:00 base_model/data/0
1024 00-00-1980 00:00 base_model/data/1
262144 00-00-1980 00:00 base_model/data/2
1024 00-00-1980 00:00 base_model/data/3
131072 00-00-1980 00:00 base_model/data/4
512 00-00-1980 00:00 base_model/data/5
5120 00-00-1980 00:00 base_model/data/6
40 00-00-1980 00:00 base_model/data/7
2 00-00-1980 00:00 base_model/version
40 00-00-1980 00:00 base_model/.data/serialization_id
--------- -------
   467205                     14 files

unzip -l /home/rei/Downloads/lora_adapter.pt

Archive:  /home/rei/Downloads/lora_adapter.pt
Length Date Time Name
--------- ---------- ----- ----
257 00-00-1980 00:00 lora_adapter/data.pkl
1 00-00-1980 00:00 lora_adapter/.format_version
2 00-00-1980 00:00 lora_adapter/.storage_alignment
6 00-00-1980 00:00 lora_adapter/byteorder
65536 00-00-1980 00:00 lora_adapter/data/0
65536 00-00-1980 00:00 lora_adapter/data/1
2 00-00-1980 00:00 lora_adapter/version
40 00-00-1980 00:00 lora_adapter/.data/serialization_id
--------- -------
   131380                     8 files

At this point the important question was: which layer does the adapter patch? Loading both checkpoints made that explicit.
python -c "import torch; b=torch.load('/home/rei/Downloads/base_model.pt',map_location='cpu',weights_only=True); l=torch.load('/home/rei/Downloads/lora_adapter.pt',map_location='cpu',weights_only=True); print('BASE_KEYS'); [print(k, tuple(v.shape)) for k,v in b.items() if hasattr(v,'shape')]; print('LORA_KEYS'); [print(k, tuple(v.shape)) for k,v in l.items() if hasattr(v,'shape')]"

BASE_KEYS
layer1.weight (256, 64)
layer1.bias (256,)
layer2.weight (256, 256)
layer2.bias (256,)
layer3.weight (128, 256)
layer3.bias (128,)
output.weight (10, 128)
output.bias (10,)
LORA_KEYS
layer2.lora_A (64, 256)
layer2.lora_B (256, 64)

That shape pairing is the classic LoRA decomposition: the update is lora_B @ lora_A, with shapes (256, 64) @ (64, 256) = (256, 256), which lands exactly on layer2.weight. The challenge title/hint made sense immediately: the secret should appear only after merging base + adapter. This part felt clean and elegant because the dimensions lined up perfectly, with no guessing.
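The dimension check can be reproduced with random stand-ins for the real tensors (a sketch; the shapes come from the key listing above, the values are invented):

```python
import numpy as np

rank = 64
base_w = np.random.randn(256, 256).astype(np.float32)   # layer2.weight
lora_A = np.random.randn(rank, 256).astype(np.float32)  # layer2.lora_A
lora_B = np.random.randn(256, rank).astype(np.float32)  # layer2.lora_B

delta = lora_B @ lora_A  # (256, 64) @ (64, 256) -> (256, 256)
merged = base_w + delta  # LoRA merge: W' = W + B @ A

assert merged.shape == base_w.shape == (256, 256)
# The update has rank at most 64, far below the full rank of 256.
assert np.linalg.matrix_rank(delta) <= rank
```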

Merging the matrices, then clipping the result to [0, 1] and scaling to byte range (0..255), exposed a sparse visual payload hidden in the weights. I extracted the nonzero bounding box and upscaled it to make the embedded text legible.
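The quantization step is just a clamp and scale; a toy example with invented values shows the saturation behavior:

```python
import numpy as np

w = np.array([-0.3, 0.0, 0.5, 1.0, 2.7], dtype=np.float32)
u8 = np.rint(np.clip(w, 0, 1) * 255).astype(np.uint8)
# Negative weights clamp to 0; values above 1 saturate at 255.
print(u8.tolist())  # -> [0, 0, 128, 255, 255]
```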
python -c "import torch,numpy as np; from PIL import Image; b=torch.load('/home/rei/Downloads/base_model.pt',map_location='cpu',weights_only=True); l=torch.load('/home/rei/Downloads/lora_adapter.pt',map_location='cpu',weights_only=True); M=b['layer2.weight'].numpy()+(l['layer2.lora_B'].numpy()@l['layer2.lora_A'].numpy()); U=np.rint(np.clip(M,0,1)*255).astype(np.uint8); ys,xs=np.where(U!=0); x0,x1,y0,y1=int(xs.min()),int(xs.max()),int(ys.min()),int(ys.max()); crop=U[y0:y1+1,x0:x1+1]; Image.fromarray(crop).resize((crop.shape[1]*8,crop.shape[0]*8),resample=Image.NEAREST).save('/home/rei/Downloads/hefty_payload/crop_nonzero_x8.png'); print('shape',U.shape); print('nonzero_bbox',(x0,y0,x1,y1)); print('crop_shape',crop.shape); print('saved','/home/rei/Downloads/hefty_payload/crop_nonzero_x8.png')"

shape (256, 256)
nonzero_bbox (27, 119, 228, 143)
crop_shape (25, 202)
saved /home/rei/Downloads/hefty_payload/crop_nonzero_x8.png

Before this, I had also tried scanning the rendered images as QR codes and searching the raw streams for common file magic bytes, a troll detour that produced nothing useful.

Reading the upscaled crop manually gives the final flag string directly: apoorvctf{l0r4_m3rg3}.
Solution
import torch
import numpy as np
from PIL import Image

# Load both checkpoints; weights_only=True avoids executing arbitrary pickle code.
base = torch.load('/home/rei/Downloads/base_model.pt', map_location='cpu', weights_only=True)
lora = torch.load('/home/rei/Downloads/lora_adapter.pt', map_location='cpu', weights_only=True)

# LoRA merge on the patched layer: W' = W + B @ A.
merged = base['layer2.weight'].numpy() + (lora['layer2.lora_B'].numpy() @ lora['layer2.lora_A'].numpy())

# Clip to [0, 1] and scale to bytes to render the hidden payload as an image.
u8 = np.rint(np.clip(merged, 0, 1) * 255).astype(np.uint8)

# Crop to the bounding box of nonzero pixels and upscale 8x for legibility.
ys, xs = np.where(u8 != 0)
x0, x1 = int(xs.min()), int(xs.max())
y0, y1 = int(ys.min()), int(ys.max())
crop = u8[y0:y1+1, x0:x1+1]
Image.fromarray(crop).resize((crop.shape[1]*8, crop.shape[0]*8), resample=Image.NEAREST).save('crop_nonzero_x8.png')

Open crop_nonzero_x8.png manually and read the rendered text.
apoorvctf{l0r4_m3rg3}