Generative artificial intelligence (GenAI) is becoming more prevalent in higher education, bringing with it both opportunities and challenges. One opportunity is using this technology to help create educational materials; one challenge is that the output of these tools may contain biased content. In this work, three text-based GenAI tools (ChatGPT-4o, Microsoft Copilot, and Google Gemini) were therefore used to develop an activity for an analytical chemistry laboratory course. In each response, the student names provided by the chatbots were quantified with respect to gender and broadly assessed for cultural representation. All three chatbots generated an equal percentage of female ("she/her") and male ("he/him") student names, but none used "they/them" pronouns, signaling a lack of inclusivity for nonbinary, gender-neutral, or gender-nonconforming individuals. The names were also dominated by those popular in English-speaking countries, highlighting a lack of cultural diversity in the output. Both of these biases could be mitigated by prompting the chatbots to provide gender-inclusive names and names that represent diverse cultural backgrounds. As educators begin to use GenAI tools to create classroom materials or have students apply this technology in their assignments, it is important to consider the potential biases that may emerge, to share this limitation with those using these tools, and to work not to perpetuate them.
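
To make the quantification step concrete, the sketch below tallies the pronoun distribution of chatbot-supplied student names. This is a minimal illustration in Python, not the study's actual analysis; the `responses` data, the names, and the pronoun labels are hypothetical placeholders standing in for names extracted manually from each chatbot's output.

```python
from collections import Counter

# Hypothetical (name, pronouns) pairs pulled from each chatbot's generated
# activity. These names are illustrative placeholders, not the study's data.
responses = {
    "ChatGPT-4o": [("Emily", "she/her"), ("James", "he/him"),
                   ("Sarah", "she/her"), ("Michael", "he/him")],
    "Copilot":    [("Anna", "she/her"), ("David", "he/him")],
    "Gemini":     [("Laura", "she/her"), ("John", "he/him")],
}

# Categories tracked in the study; a 0% share for "they/them" flags the
# inclusivity gap described above.
CATEGORIES = ["she/her", "he/him", "they/them"]

for chatbot, names in responses.items():
    counts = Counter(pronouns for _, pronouns in names)
    total = len(names)
    breakdown = ", ".join(
        f"{cat}: {100 * counts.get(cat, 0) / total:.0f}%" for cat in CATEGORIES
    )
    print(f"{chatbot}: {breakdown}")
```

Run on the placeholder data, each chatbot prints a 50%/50%/0% split across the three categories, mirroring the pattern reported here: equal female and male representation with no "they/them" names.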