Apr 25, 2024
2:15pm - 2:45pm
Room 322, Level 3, Summit
Austin Mroz<sup>1</sup>, Rebecca Greenaway<sup>1</sup>, Kim Jelfs<sup>1</sup>
<sup>1</sup>Imperial College London
Novel chemical systems are necessary to fully address the major global challenges facing humanity, including the climate emergency, resource scarcity, and energy consumption needs. Traditional chemical discovery initiatives are founded on intuition-guided, “trial-and-error” processes, in which the researcher makes small, iterative changes to chemical structure and experimental conditions.<sup>1</sup> This is significantly resource- and time-intensive; after each small modification, the molecule must be synthesized (often a trial-and-error process in itself) and its properties measured. As a result, these workflows are associated with long timescales (~20 years) and high costs.<sup>2,3</sup> Recently, computation has accelerated this process via atomistic simulations and data-driven approaches.<sup>4,5</sup> Yet, present applications of computation are largely confined to post-rationalization of experimental observations or high-throughput screening of manually curated databases of hypothetical systems.<sup>1</sup><br/>The full utility of computation in the chemical discovery pipeline is only realized with the close integration of experiment and theory.<sup>6</sup> This could be achieved via closed-loop, experimental-theoretical workflows, which are poised to efficiently identify high-performing candidate compounds for target applications. Here, AI-driven theoretical predictions guide high-throughput, automated experiments, and the experimental results are in turn used to improve the accuracy of the predictive models; this iterative process is repeated until convergence criteria are met, and is often directed using data-driven optimization strategies.<br/>Despite the initial success of AI in closed-loop chemical discovery workflows, major challenges remain in their scalable implementation; the principal bottlenecks are i) the degree of human intervention needed and ii) the lack of methods to manage the number and quality of initial data points.
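The closed-loop process described above can be sketched in code. The following is an illustrative toy, not the authors' actual models: the one-dimensional "condition" variable, the stand-in `run_experiment` objective, the nearest-neighbour surrogate, and the distance-bonus acquisition rule are all simplifying assumptions made for this sketch.

```python
# Minimal sketch of a closed-loop discovery workflow (illustrative only).
# In practice, run_experiment would dispatch a run to an automated
# synthesis/characterization platform and return a measured property.

def run_experiment(x):
    # Hypothetical measured property of condition x, peaked at x = 0.7.
    return -(x - 0.7) ** 2

def fit_surrogate(observations):
    # Toy surrogate model: predict the value of the nearest measured point.
    def predict(x):
        nearest_x, nearest_y = min(observations, key=lambda p: abs(p[0] - x))
        return nearest_y
    return predict

def acquire(predict, observations, candidates):
    # Stand-in acquisition function: predicted value plus a distance-based
    # exploration bonus, so unexplored regions are still sampled.
    def score(x):
        distance = min(abs(x - x0) for x0, _ in observations)
        return predict(x) + 0.5 * distance
    return max(candidates, key=score)

def closed_loop(n_iterations=10):
    candidates = [i / 100 for i in range(101)]                   # search space
    observations = [(x, run_experiment(x)) for x in (0.0, 1.0)]  # seed data
    for _ in range(n_iterations):
        predict = fit_surrogate(observations)                # theory: update model
        x_next = acquire(predict, observations, candidates)  # propose experiment
        observations.append((x_next, run_experiment(x_next)))  # run experiment
    return max(observations, key=lambda p: p[1])             # best condition found

best_x, best_y = closed_loop()
```

In a realistic workflow, the surrogate would typically be a probabilistic model such as a Gaussian process, the acquisition rule a principled function such as expected improvement, and the fixed iteration count would be replaced by convergence criteria.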
These bottlenecks are further complicated by the necessary integration and accommodation of, among others, i) varying metadata formats, ii) non-compatible, proprietary characterization software, and iii) hands-on robotic platform calibration and manipulation, each of which must be explicitly addressed to realize coherent, automated, closed-loop chemical discovery. Thus, data-driven solutions and supporting software are imperative to seamlessly close the loop in experimental-theoretical discovery workflows.<br/>We address these bottlenecks and present an integrated, experimental-theoretical workflow that leverages high-throughput, automated experiments and abstract computational models with data-driven optimization strategies to drive towards viable supramolecular materials for gas storage and separation applications.<br/><br/>References<br/>[1] <i>J. Am. Chem. Soc.</i>, <b>2022</b>, 144, 18730; [2] <i>Acc. Chem. Res.</i>, <b>2020</b>, 53, 599; [3] <i>Energy Environ. Sci.</i>, <b>2022</b>, 15, 579; [4] <i>Chem. Rev.</i>, <b>2020</b>, 120, 8066; [5] <i>Chem. Rev.</i>, <b>2021</b>, 121, 9816; [6] <i>Adv. Mater.</i>, <b>2021</b>, 33, 2004831.