Nowadays, the desire to embed more applications in systems as small as smart cards or sensors is growing. However, the physical limitations of these systems, such as very small main memory, together with their production cost, make this very difficult to achieve. One solution is to execute code from a secondary memory that is cheaper and denser, but slower, such as NAND Flash. Solutions based on demand paging, using a cache in main memory, have begun to be proposed and implemented in the mobile-phone domain, but they still consume too much RAM compared to what a smart card can provide. In this paper, we show that performance can be dramatically increased by reducing the size of the pages held in the cache. This in turn enables more intelligent access to the NAND. We also show that our solution makes demand paging usable within the memory limits of smart cards, where a conventional approach, offering too low a bandwidth, makes code execution from this kind of secondary memory impossible. Finally, we present important directions for optimizing our proposal further, in particular off-line code specialization aware of NAND characteristics and advanced cache properties.
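To make the idea of demand paging with reduced cache pages concrete, the following is a minimal sketch, not the authors' implementation: a direct-mapped code cache in RAM whose slots are much smaller than a NAND page, so a miss fetches only the small fragment that is actually needed. All names, sizes, and the simulated `nand_read_partial()` driver are illustrative assumptions.

```c
#include <stdint.h>
#include <string.h>

#define NAND_PAGE_SIZE   2048u   /* typical large-block NAND page size          */
#define CACHE_SLOT_SIZE   256u   /* reduced cache page, smaller than a NAND page */
#define NUM_SLOTS           8u   /* 2 KiB of RAM for the whole code cache        */

typedef struct {
    uint32_t tag;                      /* flash address of the cached fragment */
    uint8_t  valid;
    uint8_t  data[CACHE_SLOT_SIZE];
} cache_slot_t;

static cache_slot_t cache[NUM_SLOTS];

/* Stand-in for the flash contents; a real system would talk to a NAND
 * controller instead of reading from this array. */
static uint8_t nand_image[64u * 1024u];

/* Hypothetical driver call: a real driver would issue a partial-page
 * read (using the NAND column address) to transfer only `len` bytes. */
static void nand_read_partial(uint32_t addr, uint8_t *dst, uint32_t len)
{
    memcpy(dst, &nand_image[addr], len);
}

/* Return a RAM pointer to the byte at code address `addr`, loading the
 * surrounding CACHE_SLOT_SIZE fragment from NAND on a miss. */
const uint8_t *code_fetch(uint32_t addr)
{
    uint32_t base = addr & ~(CACHE_SLOT_SIZE - 1u);
    uint32_t idx  = (base / CACHE_SLOT_SIZE) % NUM_SLOTS;  /* direct-mapped */
    cache_slot_t *slot = &cache[idx];

    if (!slot->valid || slot->tag != base) {
        /* Miss: fetch only the small fragment, not the whole NAND page. */
        nand_read_partial(base, slot->data, CACHE_SLOT_SIZE);
        slot->tag   = base;
        slot->valid = 1;
    }
    return &slot->data[addr - base];
}
```

The point of the sketch is the ratio between `CACHE_SLOT_SIZE` and `NAND_PAGE_SIZE`: with slots smaller than a NAND page, the same few kilobytes of RAM hold more distinct code fragments, which is what allows demand paging to fit within smart-card memory budgets.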