Visual working memory maintains both continuous-perceptual and discrete-categorical information about memory items, but how these two kinds of information are structured in memory remains unknown. Continuous and categorical information about a single stimulus could be held in two separate representations, or combined into a single joint representation. To distinguish these possibilities, we fitted computational models assuming either separate or joint representations of continuous and categorical information to delayed-estimation data for three commonly used features (orientation, color, and shape). Across nine experiments, the model fits show that feature identity determines the representational structure: orientation is best described by a joint representation, whereas color and shape are best described by separate representations. This pattern was remarkably invariant across a variety of task contexts. Existing models miss this distinction and consequently mischaracterize memory precision.
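
To make the model-comparison logic concrete, the sketch below illustrates the general approach of fitting competing representational structures to circular delayed-estimation errors and comparing them by AIC. It is a minimal sketch only: the toy "joint" model (a single response distribution centred on a blend of the true value and its category center) and toy "separate" model (a mixture of a stimulus-centred and a category-centred component), the category centers, the von Mises noise, and the simulated data are all assumptions for illustration, not the models or data from the study.

```python
# Illustrative sketch only: toy "joint" vs "separate" models of delayed-estimation
# errors, fitted by maximum likelihood and compared with AIC. These toy models
# are NOT the models from the paper; they stand in for the general approach of
# fitting competing representational structures to circular response errors.
import numpy as np
from scipy.optimize import minimize
from scipy.stats import vonmises

rng = np.random.default_rng(0)

# Hypothetical category centers for a circular feature (radians).
CATEGORY_CENTERS = np.array([0.0, np.pi / 2, np.pi, 3 * np.pi / 2])

def nearest_category(theta):
    """Angular location of the nearest category center for each stimulus."""
    d = np.angle(np.exp(1j * (CATEGORY_CENTERS[None, :] - theta[:, None])))
    return CATEGORY_CENTERS[np.argmin(np.abs(d), axis=1)]

def joint_loglik(params, stim, resp):
    """Toy 'joint' model: one von Mises centred on a circular blend of the
    true value and its category center (continuous + categorical combined)."""
    kappa, w = params
    cat = nearest_category(stim)
    mu = np.angle(w * np.exp(1j * cat) + (1 - w) * np.exp(1j * stim))
    return np.sum(vonmises.logpdf(resp, kappa, loc=mu))

def separate_loglik(params, stim, resp):
    """Toy 'separate' model: mixture of a stimulus-centred von Mises (continuous
    representation) and a category-centred von Mises (categorical representation)."""
    kappa_c, kappa_k, p = params
    cat = nearest_category(stim)
    like = (p * vonmises.pdf(resp, kappa_k, loc=cat)
            + (1 - p) * vonmises.pdf(resp, kappa_c, loc=stim))
    return np.sum(np.log(like + 1e-300))

def fit(loglik, x0, bounds, stim, resp):
    res = minimize(lambda q: -loglik(q, stim, resp), x0,
                   bounds=bounds, method="L-BFGS-B")
    return res.fun, res.x  # negative log-likelihood, parameter estimates

# Simulate one hypothetical observer (here generated from the 'separate' toy model).
n = 500
stim = rng.uniform(-np.pi, np.pi, n)
cat = nearest_category(stim)
from_cat = rng.random(n) < 0.3
resp = np.where(from_cat,
                vonmises.rvs(8.0, loc=cat, random_state=rng),
                vonmises.rvs(15.0, loc=stim, random_state=rng))

nll_joint, _ = fit(joint_loglik, x0=[5.0, 0.3],
                   bounds=[(0.1, 200), (0.0, 1.0)], stim=stim, resp=resp)
nll_sep, _ = fit(separate_loglik, x0=[5.0, 5.0, 0.5],
                 bounds=[(0.1, 200), (0.1, 200), (0.0, 1.0)], stim=stim, resp=resp)

aic_joint = 2 * 2 + 2 * nll_joint   # 2 free parameters
aic_sep = 2 * 3 + 2 * nll_sep       # 3 free parameters
print(f"AIC joint:    {aic_joint:.1f}")
print(f"AIC separate: {aic_sep:.1f}  (lower AIC = preferred structure)")
```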