I've figured it out: I was using the wrong bounds in my `optimize_acqf` call and didn't unnormalize the candidate before passing it to my simulation package.

This fixed it:

import torch
from botorch.optim import optimize_acqf
from botorch.utils.transforms import unnormalize

candidates, _ = optimize_acqf(
    acq_function=acq_func,
    bounds=torch.tensor([[0.0], [1.0]], dtype=torch.double),  # normalized bounds!
    q=BATCH_SIZE,
    num_restarts=NUM_RESTARTS,
    raw_samples=RAW_SAMPLES,
)
# Unnormalize for real-world use (bounds_zf holds the original, unnormalized bounds for zf)
new_x_train = unnormalize(candidates, bounds_zf).detach()
params['laser_params']['zf'] = float(new_x_train)
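For anyone hitting the same confusion: `unnormalize` (and its inverse, `normalize`) from `botorch.utils.transforms` are just a per-dimension affine map between the unit cube and the real bounds. Here's a pure-Python sketch of that math with made-up bounds (not the actual `zf` bounds from this thread), just to show the round trip:

```python
def normalize_val(x, lo, hi):
    """Map x from [lo, hi] to [0, 1] (the per-dimension affine map normalize applies)."""
    return (x - lo) / (hi - lo)

def unnormalize_val(x, lo, hi):
    """Inverse map: take x in [0, 1] back to [lo, hi]."""
    return lo + x * (hi - lo)

# Hypothetical real-world bounds for zf -- not the values from this thread.
lo, hi = -0.5, 0.5

x_unit = 0.25                             # candidate from optimize_acqf, lives in [0, 1]
x_real = unnormalize_val(x_unit, lo, hi)  # this is what the simulation should receive
x_back = normalize_val(x_real, lo, hi)    # round-trips back to the unit-cube value
```

The key point is that the model and acquisition function only ever see unit-cube values, so everything leaving that loop has to pass through `unnormalize` exactly once.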

Now that it works, I see the same spikes as with the nitrogen density case, so I guess I should investigate further. It's also weird that, although I used the wrong bounds and …

Replies: 2 comments 4 replies

1 reply
@Friedemannn

3 replies
@Balandat
@Friedemannn
@Balandat
Balandat Sep 1, 2025
Collaborator

Answer selected by Friedemannn
Category: Q&A · Labels: none yet · 2 participants