Query auto-completion (QAC) is a prominent feature of search engines that suggests complete queries based on a prefix typed by the user. Existing QAC models pay little attention to user interaction and cannot capture the context of a user's information need (IN). In this work, we devise a new QAC task applied to an image, estimating the probabilities of patches (a key component of Information Foraging Theory) for query suggestion. Our approach supports query completion by extending a user's query prefix (one or two characters) to a complete query, utilising a foraging-based probabilistic patch selection model. We present iBERT, a fine-tuned BERT (Bidirectional Encoder Representations from Transformers) model that leverages combined textual-image queries to address image QAC by computing probabilities over a large set of image patches. The resulting patch probabilities are used for selection while remaining agnostic to changes in information need or contextual mechanisms. Experimental results show that query auto-completion using both natural language queries and images is more effective than using only language-level queries. Moreover, our fine-tuned iBERT model efficiently ranks patches in the image.
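To make the patch-probability idea concrete, the following is a minimal PyTorch sketch, not the paper's implementation: it assumes a pooled text encoding of the typed prefix (e.g., a BERT [CLS] vector) and precomputed visual features for each image patch, and the module name PatchScorer and all dimensions are illustrative assumptions.

```python
# Hypothetical sketch: score each image patch against an encoded query
# prefix and normalise the scores into a probability distribution over
# patches, mirroring the patch probabilities described above.
import torch
import torch.nn as nn

class PatchScorer(nn.Module):  # illustrative name, not from the paper
    def __init__(self, text_dim=768, patch_dim=2048, hidden=256):
        super().__init__()
        self.text_proj = nn.Linear(text_dim, hidden)    # project prefix encoding
        self.patch_proj = nn.Linear(patch_dim, hidden)  # project patch features

    def forward(self, prefix_vec, patch_feats):
        # prefix_vec: (text_dim,) pooled encoding of the typed query prefix
        # patch_feats: (num_patches, patch_dim) visual features per patch
        q = self.text_proj(prefix_vec)                  # (hidden,)
        p = self.patch_proj(patch_feats)                # (num_patches, hidden)
        scores = p @ q                                  # dot-product relevance
        return torch.softmax(scores, dim=0)             # patch probabilities

scorer = PatchScorer()
prefix_vec = torch.randn(768)        # stand-in for a BERT encoding of a prefix like "ca"
patch_feats = torch.randn(49, 2048)  # stand-in for CNN features of a 7x7 patch grid
probs = scorer(prefix_vec, patch_feats)
print(probs.argmax())                # index of the most probable patch
```

In this sketch, the most probable patches would ground the completion of the prefix into a full query; the actual model fine-tunes BERT end-to-end over combined textual-image inputs rather than using fixed encodings as above.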