Complex perceptual decisions, in which information must be integrated across multiple sources of evidence, are ubiquitous but not well understood. Such decisions rely on sensory processing of each individual source of evidence and are therefore vulnerable to bias if processing resources are allocated disproportionately among visual inputs. To investigate this, we developed an implicit neurofeedback protocol, embedded within a complex decision-making task, to bias sensory processing in favor of one source of evidence over another. Human participants of both sexes (N = 30) were asked to report the average motion direction across two fields of oriented moving bars. Bars of different orientations flickered at different frequencies, thereby inducing steady-state visual evoked potentials. Unbeknownst to participants, neurofeedback was implemented to implicitly reward attention to a specific "trained" orientation (rather than any particular motion direction). As attentional selectivity for this orientation increased, the motion coherence of both fields of bars increased, making the task easier without altering the relative reliability of the two sources of evidence. Critically, these neurofeedback trials were alternated with "test" trials in which motion coherence was not contingent on attentional selectivity, allowing us to assess the efficacy of training. The protocol successfully biased sensory processing, resulting in earlier and stronger encoding of the trained evidence source. In turn, this evidence was weighted more heavily in behavioral and neural representations of the integrated average, even though the two sources of evidence were always matched in reliability. These results demonstrate how biases in sensory processing can impact integrative decision-making processes.
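To make the closed-loop contingency concrete, the sketch below simulates one plausible reading of the protocol: attentional selectivity for the trained orientation is indexed by the relative SSVEP amplitude at its flicker frequency, and that index determines the motion coherence of both bar fields on the next neurofeedback trial. The flicker frequencies, sampling rate, epoch length, selectivity index, and coherence mapping are all assumptions for illustration, not the authors' implementation.

```python
# Minimal sketch (not the authors' code) of the hypothesized closed-loop
# contingency: relative SSVEP amplitude at the trained orientation's flicker
# frequency -> selectivity index -> motion coherence on the next trial.
import numpy as np

# Hypothetical flicker frequencies tagging the two orientations (Hz).
F_TRAINED, F_UNTRAINED = 15.0, 20.0
FS = 500          # assumed EEG sampling rate (Hz)
EPOCH_SEC = 2.0   # assumed analysis window per trial

def ssvep_amplitude(eeg, freq, fs=FS):
    """Amplitude of the EEG spectrum at the given flicker frequency."""
    spectrum = np.abs(np.fft.rfft(eeg)) / len(eeg)
    freqs = np.fft.rfftfreq(len(eeg), d=1.0 / fs)
    return spectrum[np.argmin(np.abs(freqs - freq))]

def attentional_selectivity(eeg):
    """Selectivity index in [0, 1]: share of SSVEP amplitude captured by
    the trained orientation's flicker frequency (0.5 = no bias)."""
    a_tr = ssvep_amplitude(eeg, F_TRAINED)
    a_un = ssvep_amplitude(eeg, F_UNTRAINED)
    return a_tr / (a_tr + a_un + 1e-12)

def next_coherence(selectivity, lo=0.1, hi=0.9):
    """Map selectivity above the unbiased level to motion coherence for
    BOTH bar fields, so the task gets easier without changing the
    relative reliability of the two evidence sources."""
    return lo + (hi - lo) * np.clip(2 * selectivity - 1, 0, 1)

# Toy usage: simulate one epoch in which attention favors the trained orientation.
t = np.arange(int(FS * EPOCH_SEC)) / FS
eeg = (1.5 * np.sin(2 * np.pi * F_TRAINED * t)
       + 0.5 * np.sin(2 * np.pi * F_UNTRAINED * t)
       + 0.3 * np.random.randn(t.size))
sel = attentional_selectivity(eeg)
print(f"selectivity = {sel:.2f}, next coherence = {next_coherence(sel):.2f}")
```

On test trials, under this reading, coherence would simply be drawn independently of the selectivity index, which is what allows training efficacy to be assessed without the feedback contingency.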