Potential gender biases in Wikipedia's content can contribute to biased behaviors in a variety of downstream NLP systems. Yet, efforts to understand inequalities in how women and men are portrayed on Wikipedia have so far focused only on biographies, leaving open the question of how often such harmful patterns occur in other topics. In this paper, we investigate gender-related asymmetries in Wikipedia titles across all domains. We find that for only half of gender-related articles, i.e., articles with words such as women or male in their titles, a symmetrical counterpart describing the same concept for the other gender (and clearly stating it in its title) exists. Among the remaining imbalanced cases, the vast majority of articles concern sports- and social-related issues. We provide insights into how such asymmetries can influence other Wikipedia components and propose steps towards reducing the frequency of the observed patterns.

Bias statement

Inequalities in how men and women are represented in Wikipedia titles can be captured by NLP models and translate into biased behaviors, creating representational harms (Blodgett et al., 2020). For example, if Wikipedia articles about national sports teams are by default about male teams, then a search engine might assume that a prototypical sportsperson is a man. When asked about famous volleyball players, such a system might exhibit recognition bias and return no women. Similarly, if Wikipedia provides special articles listing women photographers alongside photographers, then an automatic knowledge extractor trained on such data might learn and propagate the stereotypical generalization that women within these occupations are an exception and should hold special qualities. Due to data scarcity, our work limits itself to binary gender values when extracting gender bias patterns from Wikipedia titles. We acknowledge that not incorporating other gender identities into our analysis indirectly causes recognition bias against non-binary people.