Category: Miscellaneous
Flag: texsaw{w3ight5_t3ll_t4l3s}
Challenge Description
Neural networks are like onions - or was that ogres?
Analysis
The file was a Keras model saved as an HDF5 container, so the first useful question was not “can the model classify something?” but “what exactly is stored inside it?” Dumping the top-level groups and a few subkeys immediately showed ordinary model metadata alongside a suspicious layer named secret_layer. That name mattered more than the rest of the architecture, because challenge authors do not usually name a layer that unless they want you to look there.
import h5py

f = h5py.File('/tmp/ctfchan_model_heist/model.h5', 'r')
print('keys', list(f.keys()))
print('attrs', dict(f.attrs))
for k in f.keys():
    g = f[k]
    print('group', k, 'type', type(g))
    if hasattr(g, 'keys'):
        print('  subkeys', list(g.keys())[:20])

python inspect_model.py
keys ['model_weights', 'optimizer_weights']
group model_weights type <class 'h5py._hl.group.Group'>
  subkeys ['dense', 'dense_1', 'dense_2', 'flatten', 'secret_layer', 'top_level_model_weights']
group optimizer_weights type <class 'h5py._hl.group.Group'>
  subkeys ['adam']

Once secret_layer stood out, the next check was to see what tensors it actually contained. The layer had a bias vector of length 26 and a much larger kernel matrix with shape (128, 26), which is exactly the kind of place where someone could hide byte data without it being obvious in a quick strings pass.
import h5py

f = h5py.File('/tmp/ctfchan_model_heist/model.h5', 'r')
sl = f['model_weights']['secret_layer']['sequential']['secret_layer']
print('keys', list(sl.keys()))
for k in sl.keys():
    d = sl[k]
    print(k, d.shape, d.dtype)

python inspect_secret_layer.py
keys ['bias', 'kernel']
bias (26,) float32
kernel (128, 26) float32

At that point the challenge description started to make sense: this was less about machine learning and more about peeling back layers until the hidden payload showed up. The solve was to treat the floating-point weights as encoded byte data. The script below flattened each layer's weights, multiplied them by a handful of scale factors, rounded them to integers, mapped them into byte values with % 256, and searched the result for the expected texsaw{...} flag pattern. The hit landed on the secret_layer kernel at scale 1000, meaning the flag had been embedded directly into the model weights rather than produced by running inference.
import h5py
import numpy as np
import re

f = h5py.File('/tmp/ctfchan_model_heist/model.h5', 'r')
pattern = re.compile(rb'texsaw\{[^}]+\}')
for layer in ['dense', 'secret_layer', 'dense_1', 'dense_2']:
    try:
        K = f['model_weights'][layer]['sequential'][layer]['kernel'][()]
        B = f['model_weights'][layer]['sequential'][layer]['bias'][()]
    except Exception as e:
        print(layer, 'err', e)
        continue
    for scale in [10, 50, 100, 200, 500, 1000]:
        vals = np.rint(K.flatten() * scale).astype(int)
        b = bytes([(v % 256) for v in vals])
        m = pattern.search(b)
        if m:
            print(layer, 'scale', scale, 'found', m.group(0))
            raise SystemExit
    for scale in [10, 50, 100, 200, 500, 1000]:
        vals = np.rint(B * scale).astype(int)
        b = bytes([(v % 256) for v in vals])
        m = pattern.search(b)
        if m:
            print(layer, 'bias scale', scale, 'found', m.group(0))
            raise SystemExit
print('no pattern found')

python solve.py
secret_layer scale 1000 found b'texsaw{w3ight5_t3ll_t4l3s}'

The key detail was that the hidden bytes were not stored as obvious text inside the HDF5 file; they were encoded as scaled floating-point values inside the secret layer's kernel. Once those values were rounded back into integers, the flag appeared intact.
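Viewed from the embedding side, the scheme is simple to reproduce. The snippet below is a minimal sketch, not the challenge author's actual code: it assumes each flag byte was stored as byte / 1000.0 in a float32 tensor, and shows that the solve script's scale-round-mod recovery inverts that exactly.

```python
import numpy as np

# Hypothetical embedding step: pack each flag byte into a float32 weight
# as value / 1000.0 (1000 being the scale the solve script hit on).
flag = b'texsaw{w3ight5_t3ll_t4l3s}'
weights = (np.frombuffer(flag, dtype=np.uint8) / 1000.0).astype(np.float32)

# Recovery, mirroring solve.py: scale up, round to integers, map with % 256.
vals = np.rint(weights * 1000).astype(int)
recovered = bytes([(v % 256) for v in vals])
print(recovered)  # b'texsaw{w3ight5_t3ll_t4l3s}'
```

Because every byte value is below 256, float32's roughly seven significant decimal digits are more than enough for np.rint to land on the exact integer, which is why brute-forcing a small list of scale factors is a reliable recovery strategy.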
Solution
import h5py
import numpy as np
import re

f = h5py.File('/tmp/ctfchan_model_heist/model.h5', 'r')
pattern = re.compile(rb'texsaw\{[^}]+\}')
for layer in ['dense', 'secret_layer', 'dense_1', 'dense_2']:
    try:
        K = f['model_weights'][layer]['sequential'][layer]['kernel'][()]
        B = f['model_weights'][layer]['sequential'][layer]['bias'][()]
    except Exception:
        continue
    for scale in [10, 50, 100, 200, 500, 1000]:
        vals = np.rint(K.flatten() * scale).astype(int)
        b = bytes([(v % 256) for v in vals])
        m = pattern.search(b)
        if m:
            print(m.group(0).decode())
            raise SystemExit
    for scale in [10, 50, 100, 200, 500, 1000]:
        vals = np.rint(B * scale).astype(int)
        b = bytes([(v % 256) for v in vals])
        m = pattern.search(b)
        if m:
            print(m.group(0).decode())
            raise SystemExit

python solve.py
texsaw{w3ight5_t3ll_t4l3s}