We address the problem of designing data structures in the presence of faults that may arbitrarily corrupt memory locations. More precisely, we assume that an adaptive adversary can arbitrarily overwrite the content of up to δ memory locations, that corrupted locations cannot be detected, and that only O(1) memory locations are safe. In this framework, we call a data structure resilient if it is able to operate correctly (at least) on the set of uncorrupted values. We present a resilient dictionary, implementing search, insert and delete operations. Our dictionary has O(log n + δ) expected amortized time per operation, and O(n) space complexity, where n denotes the current number of keys in the dictionary. We also describe a deterministic resilient dictionary, with the same amortized cost per operation over a sequence of at least δ^ε operations, where ε > 0 is an arbitrary constant. Finally, we show that any resilient comparison-based dictionary must take Ω(log n + δ) expected time per search. Our results are achieved by means of simple, new techniques, which might be of independent interest for the design of other resilient algorithms.
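To make the faulty-memory model concrete, the following is a minimal sketch of the setting described above, not the authors' construction: memory cells that an adversary may silently overwrite up to δ times, with reads unable to distinguish corrupted from uncorrupted cells. All names (UnreliableMemory, delta, corrupt) are hypothetical and introduced only for illustration.

```python
class UnreliableMemory:
    """Array of cells; an adaptive adversary may overwrite up to delta of them.
    Only the O(1) bookkeeping fields of this object are assumed to be safe."""

    def __init__(self, size, delta):
        self.cells = [None] * size
        self.delta = delta        # upper bound on the number of corruptions
        self.corruptions = 0      # corruptions performed so far

    def write(self, i, value):
        self.cells[i] = value

    def read(self, i):
        # A read returns whatever is stored; corruption cannot be detected.
        return self.cells[i]

    def corrupt(self, i, value):
        # Adversary action: silently overwrite a cell, at most delta times.
        assert self.corruptions < self.delta, "adversary budget exhausted"
        self.cells[i] = value
        self.corruptions += 1
```

In this setting, a resilient dictionary keeps its keys in such unreliable memory while holding only O(1) words (for example, a size counter and a few pointers) in safe memory, and must answer search(k) correctly whenever k was inserted and never corrupted; the bounds stated in the abstract say this guarantee can be achieved in O(log n + δ) amortized time per operation and O(n) space.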