Generative Pre-trained Transformer For Design Concept Generation: An Exploration

Qihao Zhu, Jianxi Luo . Proceedings of the Design Society 2022 – 59 citations


Novel concepts are essential for design innovation and can be generated with the aid of data stimuli and computers. However, current generative design algorithms focus on diagrammatic or spatial concepts that are either too abstract to understand or too detailed for early-phase design exploration. This paper explores the use of generative pre-trained transformers (GPT) for natural language design concept generation. The experiments apply GPT-2 and GPT-3 to different kinds of creative reasoning in design tasks; both show reasonably good performance for verbal design concept generation.
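The verbal concept generation described above can be sketched with an off-the-shelf GPT-2 model via the Hugging Face `transformers` library. This is an illustrative assumption, not the paper's actual setup: the prompt below is hypothetical, and the paper's own fine-tuning and prompting schemes are not reproduced here.

```python
from transformers import pipeline, set_seed

# Hypothetical early-phase design prompt (not taken from the paper).
prompt = "A novel concept for a portable water purifier:"

set_seed(42)  # make the sampled completions reproducible

# Load a text-generation pipeline backed by the base GPT-2 checkpoint.
generator = pipeline("text-generation", model="gpt2")

# Sample several candidate verbal design concepts from the same prompt.
concepts = generator(
    prompt,
    max_new_tokens=40,
    do_sample=True,          # sampling is needed for multiple distinct returns
    num_return_sequences=3,
)

for c in concepts:
    print(c["generated_text"])
```

With GPT-3-class models the same idea is exposed through a hosted API rather than local weights, but the interaction pattern (prompt in, sampled verbal concepts out) is the same.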

Similar Work