Animal toxicity testing is widely used in the chemical, pharmaceutical, and research industries as part of preclinical laboratory testing to ensure that substances are safe for humans and animals. Under preclinical conditions, the safe and effective dose, acceptable dose, and exposure threshold of a substance can be established using rodents and other mammals. A substantial body of literature exists on the introduction of biomodels into experiments. To summarize this knowledge, a polythematic abstract and bibliographic database was analyzed. Attention was focused on scientometric databases such as the Web of Science Core Collection, Medline, PubMed, and RSCI, as well as data from the eLIBRARY.ru portal for the last 10 years. Statistical data and the publication activity of authors were assessed for the query "modeling and ethics in veterinary medicine and medicine."

The aim of the study was to analyze current models and the prospects for using new biomodels in pharmacological and toxicological experiments. Established protocols are important for ensuring consistency between different studies within the same model, thereby guaranteeing the reproducibility and repeatability of experiments.

Based on the analysis of literature sources from the past 10 years, it can be concluded that, despite an extensive regulatory framework and the availability of transgenic laboratory animals carrying human target genes in their genomes, full identity with humans cannot be achieved. The main emphasis in predictive toxicology will likely shift toward neural networks and computer-based replacements for existing biomodels: despite their relative accuracy, animal models are costly to maintain and feed, and the use of animals in experiments is increasingly criticized and demands new approaches.