Digital musical instruments are essential technologies in modern musical composition and performance. However, synthesizer interfaces are often unintuitive: shaping a desired tone requires specialized knowledge of many low-level parameters. To address this problem, we propose pseudo-intention learning, a novel data collection method for supervised learning in musical instrument development. Pseudo-intention learning collects a data set of paired examples, each consisting of a target tone and the input the user performs to express it. By combining a standard convolutional neural network with pseudo-intention learning, we developed a conversion framework that reflects the composer's intention. As a proof of concept, we constructed an interface that can freely manipulate the sound source of a digital snare drum and demonstrated its effectiveness in a pilot study, confirming that the tone parameters generated by our system reflected the user's intention. We also discuss how this method could be applied to richer musical expression.