
Fix duplicate sample detection at chunks size limit

Before cutting a new XOR chunk because the current chunk would go over
the size limit, check that the timestamp is in order and not equal to
or older than that of the latest sample in the old chunk.

Signed-off-by: György Krajcsovits <gyorgy.krajcsovits@grafana.com>
pull/12874/head
György Krajcsovits 1 year ago
parent commit 96d03b6f46
1 changed file, 3 additions: tsdb/head_append.go

@@ -1283,6 +1283,9 @@ func (s *memSeries) appendPreprocessor(t int64, e chunkenc.Encoding, o chunkOpts
 		c = s.cutNewHeadChunk(t, e, o.chunkRange)
 		chunkCreated = true
 	} else if len(c.chunk.Bytes()) > maxBytesPerXORChunk {
+		if c.maxTime >= t {
+			return c, false, false
+		}
 		c = s.cutNewHeadChunk(t, e, o.chunkRange)
 		chunkCreated = true
 	}
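To illustrate the bug this patch fixes: the out-of-order/duplicate check (`t <= c.maxTime`) previously ran only on the normal append path, so a chunk that was over the size limit would be cut and the duplicate sample appended to the fresh chunk. The sketch below uses simplified hypothetical types (`chunk`, `appendSample`), not Prometheus's actual `memSeries`, to show the guard being applied before the size-limit cut:

```go
package main

import "fmt"

// chunk is a simplified stand-in for a head chunk; it tracks only the
// largest timestamp appended so far and a pretend encoded size.
type chunk struct {
	maxTime int64
	bytes   int
}

// Hypothetical size limit, mirroring maxBytesPerXORChunk in the patch.
const maxBytesPerXORChunk = 2048

// appendSample mimics the patched logic: when the current chunk is over
// the size limit, first reject duplicate or out-of-order timestamps
// (t <= c.maxTime) instead of cutting a new chunk for them.
func appendSample(c *chunk, t int64) (ok, cutNew bool) {
	if c.bytes > maxBytesPerXORChunk {
		if c.maxTime >= t {
			// Before the fix, a duplicate sample here could start a
			// fresh chunk and be appended a second time.
			return false, false
		}
		// Timestamp is strictly newer: safe to cut a new chunk.
		return true, true
	}
	if c.maxTime >= t {
		return false, false
	}
	c.maxTime = t
	return true, false
}

func main() {
	c := &chunk{maxTime: 100, bytes: 4096} // over the size limit
	fmt.Println(appendSample(c, 100))      // duplicate timestamp: rejected
	fmt.Println(appendSample(c, 101))      // in-order: new chunk is cut
}
```

With the guard in place, a duplicate sample arriving at the size limit is dropped just as it would be mid-chunk, rather than being written again into a newly cut chunk.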
