Coherent optical phase-shift-keying (PSK) transmission and the phase noise problem arising from the broad linewidth of semiconductor lasers are addressed. Two ways of reducing the effect of phase noise by signal processing at the receiver are described. In the first approach, the integration time of the integrate-and-dump circuit preceding the decision device is reduced, which introduces a trade-off between the reduction in phase noise degradation and the loss of received bit energy. In the second approach, the decision time is subdivided into smaller subintervals and decisions are made from the information observed in each subinterval. It is found that in both cases the power penalty (relative to an ideal receiver) can be reduced significantly, regardless of the bit-rate-to-linewidth ratio. Notably, with the second approach the power penalty can be kept below 2 dB for reasonable bit-rate-to-linewidth values.
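
As a rough illustration of the first trade-off only (not the authors' receiver model or analysis), the Monte Carlo sketch below models the laser phase noise as a Wiener process whose increment variance is set by the linewidth, and compares the bit error rate of binary PSK when the integrate-and-dump window is shortened to a fraction of the bit period. All parameter values (bit rate, linewidth, Eb/N0, sample counts) are placeholders, and a perfect phase reference at the start of each bit is assumed.

```python
import numpy as np

rng = np.random.default_rng(0)

# Placeholder parameters (assumptions, not values from the paper).
Rb = 1e9              # bit rate [bit/s]
linewidth = 10e6      # combined linewidth [Hz] -> linewidth/Rb = 0.01
ebn0_db = 6.0         # Eb/N0 when the full bit period is integrated [dB]
N = 32                # samples per bit period
n_bits = 100_000      # Monte Carlo length
T = 1.0 / Rb
dt = T / N

def ber_for_window(frac):
    """BER of binary PSK with an integrate-and-dump window of length frac*T.

    Assumes a perfect phase reference at the start of each bit; the phase
    then drifts as a Wiener (Brownian) process during the bit.
    """
    k = max(1, int(round(frac * N)))              # samples actually integrated
    bits = rng.integers(0, 2, n_bits)
    a = 2.0 * bits - 1.0                          # BPSK symbols +/-1
    # Laser phase noise: Gaussian increments with variance 2*pi*linewidth*dt.
    dphi = rng.normal(0.0, np.sqrt(2.0 * np.pi * linewidth * dt), (n_bits, N))
    phi = np.cumsum(dphi, axis=1)
    # Complex AWGN scaled so that full-window integration yields the chosen Eb/N0.
    ebn0 = 10.0 ** (ebn0_db / 10.0)
    sigma = np.sqrt(N / (2.0 * ebn0))             # per-sample noise std per quadrature
    noise = sigma * (rng.standard_normal((n_bits, N))
                     + 1j * rng.standard_normal((n_bits, N)))
    r = a[:, None] * np.exp(1j * phi) + noise
    # Integrate-and-dump over the first k samples; decide on the sign of the real part.
    z = r[:, :k].sum(axis=1)
    return np.mean((z.real > 0) != (bits == 1))

for frac in (1.0, 0.75, 0.5, 0.25):
    print(f"integration window = {frac:4.2f} T  ->  BER = {ber_for_window(frac):.2e}")
```

Sweeping the linewidth-to-bit-rate ratio in this sketch exposes the trade-off described above: when the ratio is small, the energy lost by shortening the window dominates, whereas for larger ratios a shorter window may reduce the errors caused by phase drift within the bit.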