Computers are used in increasingly complex environments for increasingly complex tasks. One example is the use of computer simulations in instruction. A simulation offers an environment in which learners must extract information from the system and construct their knowledge themselves, which requires that learners have a high degree of control over the (complex) environment. The present study investigates the influence of two representational aspects of simulation environments on how learners interact with a simulation and on their resulting test performance. The first aspect is additional navigation support, provided as separate overviews of input and output. The second is the type of interface: a conversational interface versus a direct manipulation interface. Subjects learned about a theory of decision support using one of four versions of essentially the same simulation; in a control condition, subjects were confronted directly with the simulation model in the form of a formula. Results showed that navigation support did not raise the subjects' scores. On the contrary, subjects receiving navigation support tended toward lower test performance. These subjects made fewer iterations during the simulation than the other subjects, and the number of iterations was related to test performance. A possible explanation for their low scores is that the navigation support distracted them from their main task: learning about the model by manipulating the simulation. The direct manipulation interface successfully increased the number of changes to model variables; this, however, neither raised nor lowered the subjects' test performance. As expected, the direct manipulation interface resulted in far more efficient learning than the conversational interface.