When designing devices, we aim to generate designs that optimize some objective, such as computation time, power draw, or spectral response. In the past, solving design problems relied on the designer's intuition, exhaustive searches over large design spaces, or even sampling from NP-hard problems. To reduce the design problem's difficulty, we use latent models, such as LDMs, GANs, and VAEs, to extract symmetries and features from the data and construct a more efficient search space for design optimizers.
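As a concrete illustration, here is a minimal sketch of latent-space design optimization in PyTorch: a pre-trained decoder maps a latent vector to a device design, and we take gradient steps in the latent space to ascend a figure of merit. The decoder weights and the `figure_of_merit` objective below are hypothetical stand-ins, not our actual models.

```python
import torch

torch.manual_seed(0)
W = torch.randn(8, 64)  # hypothetical stand-in for trained decoder weights

def decode(z):
    # Hypothetical stand-in for a trained VAE/GAN/LDM decoder: latent -> design.
    return torch.sigmoid(z @ W)

def figure_of_merit(design):
    # Hypothetical stand-in for the design objective (e.g., efficiency).
    return design.mean()

z = torch.zeros(1, 8, requires_grad=True)   # start from the latent prior mean
opt = torch.optim.Adam([z], lr=0.05)

for _ in range(200):
    opt.zero_grad()
    loss = -figure_of_merit(decode(z))      # gradient ascent on the objective
    loss.backward()
    opt.step()

best_design = decode(z).detach()            # optimized design, decoded from z
```

Because the search happens in the low-dimensional latent space rather than over raw device geometries, each optimizer step stays on the manifold of realistic designs the generative model has learned.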
Generative modeling constructs probabilistic models of data, often by exploiting symmetries, constraints, and physics to reduce what needs to be learned.
Graphs are prevalent throughout mathematics, computer science, and engineering. Message passing, graph neural networks, and more recently graph transformers allow machine learning models to operate on graph-structured data.
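For intuition, a single message-passing layer can be written in a few lines of plain PyTorch: each node sums the features of its in-neighbors, then combines that aggregate with its own state. The edge list and feature sizes below are illustrative assumptions, not a specific model of ours.

```python
import torch

torch.manual_seed(0)
num_nodes, dim = 4, 16
h = torch.randn(num_nodes, dim)                         # node features
edges = torch.tensor([[0, 1], [1, 2], [2, 3], [3, 0]])  # (src, dst) pairs

W_self = torch.nn.Linear(dim, dim)                      # transform for a node's own state
W_nbr = torch.nn.Linear(dim, dim)                       # transform for aggregated messages

# Aggregate: each node sums incoming messages from its neighbors.
agg = torch.zeros_like(h)
agg.index_add_(0, edges[:, 1], h[edges[:, 0]])

# Update: combine self state and neighborhood messages, then apply a nonlinearity.
h_next = torch.relu(W_self(h) + W_nbr(agg))
```

Stacking layers like this lets information propagate across longer paths in the graph, which is the core idea behind graph neural networks; graph transformers replace the fixed-neighborhood aggregation with attention over nodes.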
Back in 2022, Michael Bezick joined our research group to work on machine learning.
Back in 2021, just as things were beginning to wind down from Covid, Sasha asked the group if anyone would be willing to chat with a potential new undergraduate researcher. I was just starting my third year, but I knew I wanted to start a research lab to work on ML for quantum devices. Luckily, this bright new researcher was extremely optimistic about quantum. I decided to meet with her, and it was evident she was beyond passionate about quantum technologies. I had never mentored an undergraduate student one-on-one before, let alone led research efforts (with Sasha's guidance, of course). Congratulations to Vea on being accepted to the ECE PhD program at Purdue :).
I traveled to Cambridge for Quantinuum's internal conference and met some great scientists, including Adam Ollanik and the Colorado photonics team.
RAPTOR was published in Advanced Photonics. I was surprised by, but grateful for, all the media coverage, including this piece from Purdue with a great title: https://stories.prf.org/raptor-takes-a-bite-out-of-global-counterfeit-semiconductor-market/