We study the extreme $$L_p$$ discrepancy of infinite sequences in the d-dimensional unit cube, which uses arbitrary sub-intervals of the unit cube as test sets. This is in contrast to the classical star $$L_p$$ discrepancy, which uses exclusively intervals that are anchored in the origin as test sets. We show that for any dimension d and any $$p>1$$, the extreme $$L_p$$ discrepancy of every infinite sequence in $$[0,1)^d$$ is at least of order of magnitude $$(\log N)^{d/2}$$, where N is the number of considered initial terms of the sequence. For $$p \in (1,\infty)$$, this order of magnitude is best possible.
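For orientation, the two notions compared above can be sketched as follows for the first N terms $$x_0,\dots,x_{N-1}$$ of a sequence $$\mathcal{S}$$ in $$[0,1)^d$$; the unnormalized convention (counting function minus N times the volume of the test interval) is an assumption here and may differ from the normalization used in the paper:

$$L_{p,N}^{\mathrm{extr}}(\mathcal{S}) = \left( \int_{[0,1]^d} \int_{[\boldsymbol{u},\boldsymbol{1}]} \bigl| A_N([\boldsymbol{u},\boldsymbol{v})) - N\,\lambda([\boldsymbol{u},\boldsymbol{v})) \bigr|^p \,\mathrm{d}\boldsymbol{v}\,\mathrm{d}\boldsymbol{u} \right)^{1/p},$$

$$L_{p,N}^{\mathrm{star}}(\mathcal{S}) = \left( \int_{[0,1]^d} \bigl| A_N([\boldsymbol{0},\boldsymbol{v})) - N\,\lambda([\boldsymbol{0},\boldsymbol{v})) \bigr|^p \,\mathrm{d}\boldsymbol{v} \right)^{1/p},$$

where $$A_N(J)$$ denotes the number of indices $$n<N$$ with $$x_n \in J$$ and $$\lambda$$ is the Lebesgue measure. The extreme version thus tests all axis-parallel sub-intervals $$[\boldsymbol{u},\boldsymbol{v}) \subseteq [0,1)^d$$, while the star version tests only intervals anchored at the origin.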