The Satellite Internet of Things (SIoT), a space network consisting of numerous Low Earth Orbit (LEO) satellites, has recently been regarded as a promising technology, since it is the only solution that provides 100% global coverage of the earth without any additional terrestrial infrastructure. However, compared with Geostationary Earth Orbit (GEO) satellites, LEO satellites move fast and cover a given area for only 5-12 minutes per pass, which makes network access highly dynamic. Furthermore, to reduce cost, the power and spectrum (channel) resources of each LEO satellite are very limited, i.e., less than 10% of those of a GEO satellite. It is therefore very challenging to design an efficient resource allocation scheme for SIoT that takes full advantage of these limited resources. Existing satellite resource allocation schemes are mostly designed for GEO and do not consider many LEO-specific concerns, including constrained energy, mobility, and the dynamics of connections and transmissions. To this end, we propose DeepCA, a novel reinforcement-learning-based approach for energy-efficient channel allocation in SIoT. In DeepCA, we first introduce a new sliding-block scheme to facilitate modeling the dynamics of LEO satellites, and formulate the dynamic channel allocation problem in SIoT as a Markov decision process (MDP). We then propose a deep reinforcement learning algorithm to learn the optimal channel allocation. To accelerate the learning process of DeepCA, we represent user requests in image form to reduce the input size, and carefully divide each action into multiple mini-actions to reduce the size of the action set. Extensive simulations show that the proposed DeepCA approach saves at least 67.86% of energy consumption compared with traditional algorithms.
INDEX TERMS Energy efficiency, channel allocation, artificial intelligence, reinforcement learning, Internet of Things.
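The following is a minimal, hypothetical sketch of the two ideas named in the abstract, a request state encoded as an image and an action space factored into per-request mini-actions, using a standard DQN-style Q-network. It is not the authors' implementation; the class names, grid dimensions, and network shape are illustrative assumptions.

```python
# Hypothetical sketch (not the paper's code): a Q-network over an "image" of
# channel occupancy, with one mini-action = assigning one channel to the
# current request. Grid size and architecture are assumed for illustration.
import random
import numpy as np
import torch
import torch.nn as nn

N_CHANNELS, N_SLOTS = 8, 16          # assumed grid: channels x time slots

class QNet(nn.Module):
    """Small CNN mapping the request/occupancy image to Q-values over mini-actions."""
    def __init__(self, n_actions=N_CHANNELS):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.Flatten(),
        )
        self.head = nn.Linear(32 * N_CHANNELS * N_SLOTS, n_actions)

    def forward(self, x):                # x: (batch, 1, N_CHANNELS, N_SLOTS)
        return self.head(self.conv(x))

def select_mini_action(qnet, state_img, epsilon=0.1):
    """Epsilon-greedy choice of one channel (a mini-action) for the current request."""
    if random.random() < epsilon:
        return random.randrange(N_CHANNELS)
    with torch.no_grad():
        q = qnet(torch.from_numpy(state_img).float().unsqueeze(0).unsqueeze(0))
    return int(q.argmax(dim=1).item())

# Example: a random occupancy image (1 = channel busy in that time slot).
state = (np.random.rand(N_CHANNELS, N_SLOTS) > 0.7).astype(np.float32)
qnet = QNet()
print("chosen channel:", select_mini_action(qnet, state))
```

Factoring a full allocation into mini-actions, as sketched above, keeps the Q-network output at the number of channels rather than the exponential number of complete allocations, which is the motivation the abstract gives for reducing the action-set size.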