We introduce mathematical objects that we call "directional fibers," and show how they enable a new strategy for systematically locating fixed points in recurrent neural networks. We analyze this approach mathematically, and use computer experiments to show that it consistently locates many fixed points across networks of arbitrary size with unconstrained connection weights. Comparison with a traditional method shows that our strategy is competitive and complementary, often finding fixed-point sets that are both larger than and distinct from those the traditional method finds. We provide theoretical groundwork for further analysis and suggest next steps for developing the method into a more powerful solver.
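To make the idea concrete, here is a minimal numerical sketch, not the paper's algorithm. It assumes a Hopfield-style update v -> tanh(Wv), so that a fixed point satisfies f(v) = tanh(Wv) - v = 0, and it takes a "directional fiber" for a direction c to be the solution curve of f(v) = a*c; this reading is an interpretation of the description above, not a definition taken from the text. Fixed points are then the fiber points where a = 0. The traversal below is generic predictor-corrector continuation (Euler predictor, Gauss-Newton corrector), standing in for whatever traversal scheme the method actually uses; all names (f, Df, traverse_fiber, W, c) are illustrative.

```python
# Hedged sketch of fixed-point location via a directional fiber.
# Assumptions: update rule v -> tanh(W v); the fiber for direction c is the
# curve F(v, a) = tanh(W v) - v - a*c = 0.  Not a reference implementation.
import numpy as np

def f(v, W):
    """Fixed-point residual: v is a fixed point iff f(v) = 0."""
    return np.tanh(W @ v) - v

def Df(v, W):
    """Jacobian of f at v: diag(1 - tanh(W v)^2) @ W - I."""
    d = 1.0 - np.tanh(W @ v) ** 2
    return d[:, None] * W - np.eye(len(v))

def traverse_fiber(W, c, steps=3000, h=0.01):
    """Follow the curve F(v, a) = f(v) - a*c = 0 from the origin by
    predictor-corrector continuation; each sign change of a flags a
    candidate fixed point.  (tanh(0) = 0, so the origin is a fixed
    point and lies on every fiber.)"""
    N = W.shape[0]
    x = np.zeros(N + 1)                 # x = (v, a), starting at the origin
    fixed_points = [np.zeros(N)]
    t_prev = np.zeros(N + 1)
    t_prev[-1] = 1.0                    # arbitrary initial orientation; a full
                                        # search would also traverse with -1
    for _ in range(steps):
        J = np.hstack([Df(x[:N], W), -c[:, None]])  # N x (N+1) Jacobian of F
        t = np.linalg.svd(J)[2][-1]     # unit tangent: null vector of J
        if t @ t_prev < 0:              # keep a consistent travel direction
            t = -t
        x_new = x + h * t               # Euler predictor step along the fiber
        for _ in range(10):             # Gauss-Newton corrector back onto it
            r = f(x_new[:N], W) - x_new[N] * c
            J = np.hstack([Df(x_new[:N], W), -c[:, None]])
            x_new = x_new - np.linalg.lstsq(J, r, rcond=None)[0]
        if x[N] * x_new[N] < 0:         # a crossed zero: refine with Newton
            v_fp = x_new[:N].copy()     # (duplicates are not filtered here)
            for _ in range(20):
                v_fp = v_fp - np.linalg.lstsq(Df(v_fp, W), f(v_fp, W),
                                              rcond=None)[0]
            fixed_points.append(v_fp)
        x, t_prev = x_new, t
    return fixed_points

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    N = 2
    W = 1.5 * rng.standard_normal((N, N))   # unconstrained connection weights
    c = rng.standard_normal(N)
    c /= np.linalg.norm(c)                  # chosen fiber direction
    for v in traverse_fiber(W, c):
        print("candidate:", np.round(v, 4),
              " |f(v)| =", np.linalg.norm(f(v, W)))
```

The point of the sketch is the systematic character the abstract claims: rather than restarting a local solver from random seeds, a single one-dimensional curve is followed deterministically, and every transversal crossing of a = 0 yields a fixed point along the way.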