DBpedia is one of the richest linked-data resources available today, and accessing its databases requires a structured query language such as SPARQL. However, not all users know SPARQL, so a natural-language query system is needed to translate a user's question into the corresponding query syntax. Generating query syntax through such a system is both time-consuming and expensive. To improve the efficiency of generating query syntax from user questions, a multi-label template approach, Light-QAwizard, has been used: it transforms the problem into one or more single-label classifications via multi-label learning templates. Light-QAwizard reduces query costs by 50%, but it introduces a new label during the transformation process, leading to sample imbalance, compromised accuracy, and limited scalability. To overcome these limitations, this paper employs two multi-label learning methods, Binary Relevance (BR) and Classifier Chains (CC), for question transformation. Using a Recurrent Neural Network (RNN) as the multi-label classifier for generating RDF triples, we predict all labels that align with the query intention. To better capture the relationships between RDF triples, we integrate BR into an ensemble learning approach, yielding an Ensemble BR. Experimental results demonstrate that the proposed method outperforms previous research in terms of query accuracy.
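The two decompositions named above can be illustrated with a minimal sketch: Binary Relevance trains one independent binary classifier per label, while a Classifier Chain feeds earlier label predictions into later classifiers to capture label dependencies (here, dependencies between RDF-triple templates). The tiny threshold "classifier" and the toy data below are illustrative stand-ins, not the paper's RNN model.

```python
def train_stump(X, y):
    """Illustrative classifier: threshold on the feature most correlated with y."""
    best = max(range(len(X[0])),
               key=lambda j: abs(sum(x[j] * (2 * t - 1) for x, t in zip(X, y))))
    return lambda x: int(x[best] > 0)

def binary_relevance(X, Y):
    # BR: one classifier per label, each trained independently of the others.
    models = [train_stump(X, [row[k] for row in Y]) for k in range(len(Y[0]))]
    return lambda x: [m(x) for m in models]

def classifier_chain(X, Y):
    # CC: classifier k sees the original features plus labels 0..k-1,
    # so later labels can depend on earlier ones.
    models = []
    for k in range(len(Y[0])):
        Xk = [x + row[:k] for x, row in zip(X, Y)]
        models.append(train_stump(Xk, [row[k] for row in Y]))
    def predict(x):
        labels = []
        for m in models:
            labels.append(m(x + labels))  # chain predictions forward
        return labels
    return predict

# Toy data: label 0 follows feature 0, label 1 follows feature 1.
X = [[1, 1], [1, -1], [-1, 1], [-1, -1]]
Y = [[1, 1], [1, 0], [0, 1], [0, 0]]
br = binary_relevance(X, Y)
cc = classifier_chain(X, Y)
print(br([2, -3]))  # [1, 0]
print(cc([2, -3]))  # [1, 0]
```

In practice both decompositions wrap a stronger base learner (the paper uses an RNN); CC differs from BR only in augmenting each successive classifier's input with the preceding labels.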