Advancements in AI will soon enable tools that provide automatic feedback to American Sign Language (ASL) learners on some aspects of their signing, but there is a need to understand learners' preferences for submitting videos and receiving feedback. Ten participants in our study were asked to record a few sentences in ASL using software we designed, and we provided manually curated feedback on one sentence in a manner that simulated the output of a future automatic feedback system. Participants responded to interview questions and a questionnaire eliciting their impressions of the prototype. Our initial findings provide guidance for future designers of automatic feedback systems for ASL learners.