We describe how to alleviate plateau phenomena that may arise in learning with a CANFIS neuro-fuzzy modular network. The network model consists of multiple "local-expert" MLPs (multilayer perceptrons) mediated by fuzzy membership functions. Even with such a complex modular architecture, our recently developed second-order stagewise backpropagation procedure efficiently evaluates the Hessian matrix of a given objective function, for which we employ the sum-squared-error measure. For concreteness, we use a small curve-fitting problem that allows us to present a detailed analysis based on the Hessian matrix. In particular, we describe how to use a block-diagonal approximate local Hessian matrix associated with a "bottleneck" local-expert MLP that performs poorly on its designated task. Since the poor performance implies relatively large residuals (or errors), this local Hessian matrix tends to be indefinite; it is therefore worth exploiting its negative curvature to escape from plateaus.
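To illustrate the final point, the following minimal sketch (in Python with NumPy) shows the basic mechanism of a negative-curvature escape step: if the smallest eigenvalue of an indefinite local Hessian block is negative, moving along the associated eigenvector decreases the quadratic model of the objective even where the gradient is nearly zero, as on a plateau. This is an illustrative assumption-laden example, not the paper's stagewise procedure; the function name `negative_curvature_step`, the plain eigendecomposition, and the fixed step size are all hypothetical choices introduced here.

```python
import numpy as np

def negative_curvature_step(H_block, grad, step_size=0.1):
    """Return a parameter update exploiting negative curvature, if any.

    H_block : (n, n) symmetric local Hessian block (possibly indefinite)
    grad    : (n,) gradient of the objective w.r.t. this block's weights
    """
    eigvals, eigvecs = np.linalg.eigh(H_block)   # eigenvalues in ascending order
    lam_min, v_min = eigvals[0], eigvecs[:, 0]
    if lam_min < 0.0:
        # Along d = v_min we have d^T H d = lam_min < 0, so the local
        # quadratic model decreases in both directions +/- v_min.
        # Orient the step so it does not oppose the (possibly tiny) gradient.
        slope = grad @ v_min
        direction = -np.sign(slope) * v_min if abs(slope) > 1e-12 else v_min
        return step_size * direction
    # No negative curvature: fall back to a normalized gradient-descent step.
    return -step_size * grad / (np.linalg.norm(grad) + 1e-12)

# Tiny demonstration with a synthetic indefinite 2x2 Hessian block:
# a near-zero gradient (as on a plateau) and one negative eigenvalue.
H = np.array([[1.0, 0.0],
              [0.0, -0.5]])   # saddle-like block: one negative eigenvalue
g = np.array([1e-9, 0.0])    # gradient is essentially flat
print(negative_curvature_step(H, g))  # steps along the negative-curvature axis
```

In a modular setting such as the one described above, the same probe would be applied to the block-diagonal sub-matrix belonging to the underperforming local-expert MLP, since its large residuals are what make indefiniteness likely.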